Only a few tasks left before dropping the work-in-progress label of the "Principles of Language Design" talk.
https://soc.me/talks/language-design
(requires a browser with JXL support)
Rust calling their unchecked blocks "unsafe" is an ongoing cost that will never stop as long as the language exists.
Interesting issue I hit recently: If you combine unified condition expressions with unions, you could get code like this:
union X of String
let foo: X = "bar"
if foo
... is String { ???.size }
How do I refer to the "cast" String value inside the curly braces here?
With "classic" enums it was easy, because they had an extra syntactic layer of wrapping that my unions lack.
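For comparison, here is a hedged Rust sketch of the "classic enum" case (my analogy, not the hypothetical language above): the variant name is exactly that extra syntactic layer, and the pattern that strips it off also provides the binding site for the unwrapped value.

```rust
// A classic wrapped enum: the variant `Str` is the extra
// syntactic layer of wrapping that a bare union lacks.
enum X {
    Str(String),
}

fn main() {
    let foo: X = X::Str("bar".to_string());
    // The pattern introduces a name (`s`) for the unwrapped
    // value, so there is an obvious way to refer to it.
    if let X::Str(s) = foo {
        println!("{}", s.len()); // prints 3
    }
}
```

Without the wrapper, a union-based design needs some other way to introduce that name, e.g. an explicit binding in the condition or a flow-typed re-use of `foo` itself.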
Is it just me, or are all the language options for building shaders some milquetoast reinterpretations of '80s C/C++?
A module system that doesn't account for generics is also called "a draft of a module system".
Despite ample evidence to the contrary, #Ilive (hmm, if I were also #evil, that would be a palindrome as well as a visual collision)
Fascinating (if I do say so) #lispgames #gamejam #gamedev #retrospective on #itch_io
https://lispy-gopher-show.itch.io/lispmoo2/devlog/834615/princess-revisited
I am enormously happy with the
{ verb [ dobj [ prep iobj ] ] } x
language dynamic, and how it shares your #lisp #repl, and their concerns are just... different, so they don't collide.
I guess I get my #languageDesign friends a little better now.
Thoughts?
I've been thinking about permacomputing and digital conservation lately.
Every language compiler or interpreter should contain a full guide + specification to its language. How else are people going to use it if they find it on some disk in a couple of decades' time...
For the most part #Swift doesn't seem to have too many half-baked ideas in the language, but property wrappers definitely fall into the category of poorly thought-out features. In particular, making wrappers composable is something the language designers did not think through. This post by Noah Gilmore describes one technique, but it is fiddly.
https://noahgilmore.com/blog/nesting-property-wrappers/ #programming #languagedesign
#languagedesign #compiler quandary.
My favourite language (to be named) aims to be friendly and correct.
So, it allows a comment containing text that looks like literals. And it allows literal text containing all the markers that would normally mean a comment.
And it allows comments within comments, to allow a block of code to be commented out easily, even when it contains all the above.
Q: Have I gone too far?
I've spent more time getting this correct in pass 1 than anything else!
Reflective Towers of Interpreters
https://blog.sigplan.org/2021/08/12/reflective-towers-of-interpreters/
@mcc @hikari I have been doing language design for 15 years, and I still love talking about every aspect involved in it.
But I literally never want to deal with Rust people ever again.
Seemingly their only motivation in language design discussions is to defend Rust and to assume that anyone not cloning Rust's approach 100% verbatim just means they haven't been crabsplained enough.
As I think I mentioned recently (in my LispNYC talk?), McCarthy, at least by the time of the standards, wanted there not to be any single dialect of Lisp that claimed the unadorned name Lisp. So there was syntactic room for dialects to compete.
There was discussion in the design of EuLisp, an influence on ISLISP, about layers. EuLisp had 3 layers, I think. That layering didn't make it into ISLISP. I don't recall it being proposed for CL, though I might be forgetting at this point.
It was never clear to me, though I was not part of their community, whether the underlying language was a subset of, or just an implementation language for, the surface language.
My only very concrete memory was that someone wanted some transform T such that L1 = T(L0) to be applied at some point. Maybe this came up in the CL macros committee. I was pretty adamant that CL did not need macro "hygiene" like Scheme has.
The urgency of that is due to Scheme's choice of being what I called a Lisp1 (the names Lisp1 and Lisp2 in this context are no relation to the versioning you referred to in reference to Lisp 1.5, but are abstract categories that address how many variable namespaces are in the language).
Various characteristics of CL, including but not limited to being a Lisp2, helped insulate it from name collisions in the macro system, so the macro system didn't need to have formal hygiene like in Scheme, or so I claimed.
I don't think the CL community wanted to change its macro system. Indeed I'd go so far as to say the easy writing of macros is strongly correlated with the rapid success of the family of Lisps that became CL. Breaking something that was succeeding wildly seemed ill-advised, or at least that's how I perceived things. Obviously this was a community discussion.
The fact of a package system rather than a lexical module system was also relevant to this. It's a subtle issue, but important. Scheme is a language defined by its programs' texts. In Common Lisp, the language semantics is on objects.
Speaking only very approximately here for brevity, the compiler operates on lists made of conses and symbols, not text made of parens and alphanumeric tokens. So there need be no source text in CL. Gensyms can easily be created by macros for variable names in CL, something you can't do in Scheme without the help of a hygienic macro system to do rewrites for you. The use of gensyms further insulates macros from name collisions that would be routine in Scheme if it had a CL-style macro system.
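As a cross-language aside (my addition, not part of the thread above): Rust's declarative macros have hygiene of roughly the kind discussed for Scheme, so a variable a macro introduces cannot collide with one at the call site — the effect that gensyms achieve by hand in CL.

```rust
// Hygienic macro: the `tmp` introduced in the macro body lives in
// its own scope and cannot capture or shadow the caller's `tmp`.
macro_rules! double {
    ($e:expr) => {{
        let tmp = $e; // hygienic; distinct from any caller `tmp`
        tmp + tmp
    }};
}

fn main() {
    let tmp = 10; // the caller's `tmp` is untouched by the macro
    let result = double!(tmp + 1); // expands without capturing
    assert_eq!(result, 22);
    assert_eq!(tmp, 10);
    println!("{result}"); // prints 22
}
```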
I think the way in which the package system, the macro system, and the namespacing form a kind of ecosystem is ill-understood by most users, even though they're comfortable with the effect. It only matters to understand it if someone suggests changing one of these without changing the others.
It's sometimes hard for someone from the Scheme community to understand why macro hygiene isn't needed in CL if they don't get how packages are fundamentally different from lexical namespacing. So they propose isolated changes thinking it will solve a problem that isn't really there. It's also hard for them to get why CL users see such proposals as adding complexity rather than removing it.
Going one level meta, it's worth observing that there are not good language features and bad language features, but features good within a context or bad within a context. People sometimes get a feature they like and want to impose it on other languages, not seeing that it's harmonious in some and not in others.
Context matters in assessing abstractions like goodness. It's the harmony of the ecosystem that matters. Adopting change, even well-meaning change, can be very disruptive, sometimes causing unexpected cascade effects (and community cost in both dollars and happiness) one doesn't anticipate.
Some of this is covered in the paper Gabriel and I wrote called Technical Issues of Separation in Function Cells and Value Cells. We were tasked by X3J13 to write such a thing because we disagreed on how things should go and between the two of us would likely hit all the issues. :) If you read carefully, you'll see the paper is a kind of debate between us where one would say something and the other would say "yeah, but.." and inject counterpoint. (This is the origin of the namespacing categories Lisp1/Lisp2. Also, the original paper, titled just Issues of Separation in Function Cells and Value Cells was longer and more X3J13-specific. We tightened it up for journal publication, mostly at Gabriel's insistence, as "Technical Issues..." to emphasize some dropped politics.) https://www.nhplace.com/kent/Papers/Technical-Issues.html
I may be far afield and I have no idea if that answers your question or just creates more questions, so I'll stop. :)
Bob Nystrom, the author of Crafting Interpreters, wrote an article I really enjoyed: "Does Go Have Subtyping?"
He communicates pretty advanced concepts in a very easy-to-read article. The intersection of programming language theory and implementation tradeoffs is really interesting! He helps bridge the gap between the terms used in these areas.
https://journal.stuffwithstuff.com/2023/10/19/does-go-have-subtyping/
Are there other programming languages with something like the Rust borrow checker?
https://arxiv.org/pdf/2005.09028v2.pdf
#DSL for #DSL . good reading if you like DSLs and #languagedesign
How would you imagine language support for index sequences in C++? Each option would introduce a pack of indexes the size of the existing `pack` #cpp #cplusplus #languagedesign
The thing that I love about working on programming languages and runtimes is that "oh no, everything is broken forever" bugs usually have a reproduction that is like... 3-4 lines of code and use a single core and no external inputs
Okay, adding #swift Automatic Reference Counting (ARC), which is thread safe, to what my dream #programmingLanguage should have. Compared to Swift's ARC, #nim's preferred implementation, ORC, doesn't leak memory when dealing with reference cycles. Although I don't know if #nimlang's implementation is thread safe as well.
Maybe my theoretical #languageDesign should warn, but leak upon reference cycles. It seems to make the implementation easier.
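For what it's worth, Rust's `Rc` takes roughly that "just leak" position (my comparison, not from the post above): plain reference counting without cycle collection never frees a cycle, and the escape hatch is `Weak`. A minimal sketch of the leak:

```rust
use std::cell::RefCell;
use std::rc::Rc;

// A node that can point at another node, enough to form a cycle.
struct Node {
    next: RefCell<Option<Rc<Node>>>,
}

fn main() {
    let a = Rc::new(Node { next: RefCell::new(None) });
    let b = Rc::new(Node { next: RefCell::new(Some(Rc::clone(&a))) });
    // Close the cycle: a -> b -> a.
    *a.next.borrow_mut() = Some(Rc::clone(&b));

    // Each node now holds two strong references (the local binding
    // plus the other node's pointer), so neither count can reach
    // zero when `a` and `b` go out of scope: the cycle leaks.
    assert_eq!(Rc::strong_count(&a), 2);
    assert_eq!(Rc::strong_count(&b), 2);
}
```

Breaking the cycle requires making one edge a `Weak`; a warn-but-leak language design would presumably diagnose the all-strong case instead of collecting it.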