This leads to the problem that scraps with the same logic but different variable names result in different hashes. Unison solves this by erasing all names when “compiling”.
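To make the name-erasure idea concrete, here's a toy Python sketch (not scrapscript's actual encoding, which I don't know): variable names in a little lambda-calculus term are replaced with De Bruijn-style indices before hashing, so two terms that differ only in names hash the same.

```python
import hashlib

def normalize(term, env=()):
    """Erase variable names by replacing them with binder distances."""
    kind = term[0]
    if kind == "var":
        # The name is replaced by how many binders up it was introduced.
        return ("var", env.index(term[1]))
    if kind == "lam":
        _, name, body = term
        return ("lam", normalize(body, (name,) + env))
    if kind == "app":
        _, f, x = term
        return ("app", normalize(f, env), normalize(x, env))
    raise ValueError(f"unknown term kind: {kind}")

def scrap_hash(term):
    """Hash only the name-erased logic."""
    return hashlib.sha256(repr(normalize(term)).encode()).hexdigest()

# \x -> x  and  \y -> y  normalize to the same term, so same hash:
id_x = ("lam", "x", ("var", "x"))
id_y = ("lam", "y", ("var", "y"))
assert scrap_hash(id_x) == scrap_hash(id_y)
```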
Perhaps it would make sense to separate metadata from content.
So, essentially, a scrap only contains its logic, and the hash only addresses that logic. Metadata is associated with scraps at the scrap map level, and adds stuff like parameter names, documentation, and other info useful for IDEs. This means that these things can change without the actual scrap hash changing.
I’m not entirely sure if this is desirable, but it could even be used for extending syntax at a later date in a backwards-compatible way. Essentially, a scrap would be stored as its flat, desugared version, and contain metadata that tells the scrapscript tooling which syntax this should resugar to.
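Something like this, maybe (all field names made up for illustration, the real scrap map format would be whatever the tooling decides):

```python
# Hypothetical scrap map entry: the key is the hash of the desugared
# logic only; everything else is metadata that can change freely
# without the hash changing.
scrap_map = {
    "<hash-of-desugared-logic>": {
        "metadata": {
            "name": "list/map",
            "params": ["f", "xs"],          # parameter names for IDEs
            "docs": "Apply f to every element of xs.",
            "sugar": "pipeline",            # which syntax to resugar to
        },
    },
}
```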
I think I agree with @lim: following the path blazed by Unison is probably a good one. Metadata independent of the logic.
I think the higher level metadata question will be a really important one to make scraps and scrapyard work. Most of the time when shopping around for libraries, I don’t look for specific functions (though it does happen), I look for collections of related functions. IMO we’ll need to come up with something analogous to a library. Probably just a meta scrap that points to a bunch of other scraps. Call it a scrapheap or something and attach as much metadata to it as you can.
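A scrapheap could be as simple as this (everything here is invented for illustration):

```python
# Hypothetical "scrapheap": a meta scrap that just points at other
# scraps by hash and carries as much discovery metadata as possible.
scrapheap = {
    "name": "list-utils",
    "description": "Common helpers for working with lists.",
    "tags": ["list", "collection", "functional"],
    "scraps": {
        "map":    "<hash-of-map-scrap>",
        "filter": "<hash-of-filter-scrap>",
        "fold":   "<hash-of-fold-scrap>",
    },
}
```

Since the scrapheap only references hashes, it stays valid as long as those scraps exist in the scrapyard, and its own metadata can evolve independently.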
Maybe not feasible, but in the future you could have some type of semantic indexing (e.g. a vector DB) over scraps in the scrapyard. Instead of looking for libraries, the tooling could use that to hunt for applicable functions for whatever you’re working on?
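Rough idea of what that lookup could feel like. A real system would use learned embeddings and a proper vector DB; a bag-of-words vector stands in here just so the similarity ranking is runnable, and the scraps/hashes are made up:

```python
import math

def embed(text):
    """Toy stand-in for an embedding: a bag-of-words count vector."""
    vec = {}
    for word in text.lower().split():
        vec[word] = vec.get(word, 0) + 1
    return vec

def cosine(a, b):
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Docs attached to scraps at the metadata level, keyed by scrap hash.
docs = {
    "<hash-1>": "apply a function to every element of a list",
    "<hash-2>": "parse json text into a record",
}

query = embed("map a function over a list")
best = max(docs, key=lambda h: cosine(query, embed(docs[h])))
# best is the hash of the most semantically similar scrap.
```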
I think the “library” abstraction is purely for metadata for discovery on related functions. It’s really hard to do discovery when you can’t attach a meaningful name to the function (and most names are not great).