Adespoton

  • 0 Posts
  • 184 Comments
Joined 2 years ago
Cake day: June 4th, 2023

  • Interesting story.

    I started using Objective-C in 1994 on NeXTcubes, and later NeXTstations.

    For simpler, one-off projects it was great; it was also great for its ability to wrap any existing C library or function (or even a block of asm) in an object that played nicely with all the rest (see the sketch at the end of this comment). And every API was just another set of objects! Discovery was easy.

    It wasn’t until it came to maintaining complex codebases that it became a problem. There’s a reason things like NSURLHandle stuck around right into modern macOS: replacing objects like THAT had implications all up and down the dependency chain. Essentially, it became Apple’s equivalent of DLL Hell.

    It was also the last language that I thought could be almost all things to all people; after that, I realized that specialized languages that performed really well in a single context were a much better way to go.
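
    As a rough illustration of that C-wrapping ability, here is a minimal sketch (the class and method names are hypothetical, not anything from the era): an ordinary C library call, sqrt() from libm, exposed behind an Objective-C object so it can be messaged like everything else.

        #import <Foundation/Foundation.h>
        #include <math.h>

        // Hypothetical wrapper: a plain C function exposed as an object.
        @interface SquareRooter : NSObject
        - (double)rootOf:(double)value;
        @end

        @implementation SquareRooter
        - (double)rootOf:(double)value {
            return sqrt(value);  // the underlying C library call
        }
        @end

        int main(void) {
            @autoreleasepool {
                SquareRooter *rooter = [[SquareRooter alloc] init];
                NSLog(@"sqrt(2) = %f", [rooter rootOf:2.0]);  // ~1.414214
            }
            return 0;
        }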

  • The term “Artificial Intelligence” has been bandied around for over 50 years to mean all sorts of things.

    These days, all sorts of machine learning are generally classified as AI.

    But I used to work with Cyc and expert systems back in the 90s, and those were considered AI back then, even though they often weren’t trying to mimic human thought.

    For that matter, the use of Lisp in the 1970s to perform recursive logic was considered AI all by itself.

    So while you may personally prefer a more restrictive definition (just as many were up in arms when “hacker” was co-opted to refer to people doing digital burglary), AI as the term is used by the English-speaking world encompasses generative and diffusion-based creation models, as well as other less human-centric computing models that rely on machine learning principles.