[ot][spam][draft/notes] braindead learning: algorithms research
Once I took a class in data structures and algorithms. We went through the common structures, learned them, and used them, with a lecture validating each topic with boardwork, much like any other class. I don’t remember well now, but I’m guessing about half of the algorithms were new to me, and the others I was already using; I imagine I had come up with some independently. CS was still a new field, so hobbyists, especially young nerds, often had a jump on things. I imagine it doesn’t have to be a new field for that to be the case. Relearning things, I often have to repeat them ad nauseam, as if I have dementia, to retain much. I’d like to learn to _discover_ algorithms and data structures, like I used to, not simply parrot what’s mainstream.
When concepts feel small and familiar enough, it becomes possible to describe _how_ to discover algorithms or data structures, and in general that seems like a more useful thing to teach others.
I’m thinking a major component was skill at manipulating smaller, normative concepts and structures in complex ways. With practice, one can imagine more and more simultaneous pointers with ease. Memory competitions in other knowledge sets show that anybody can learn to hold many items in mind at once, with practice. Another part is fluency with basic building blocks like pointers and for loops. There are also various transforms one can perform on structures and algorithms, to mutate them into something with different properties: speed, storage use, and other forms of complexity.
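A minimal sketch of one such transform, my own illustration rather than anything from a curriculum: checking a list for duplicates by pairwise comparison uses no extra storage, and remembering seen items in a hash set spends memory to buy speed. The function names are mine.

```python
def has_duplicate_scan(items):
    """O(n^2) time, O(1) extra space: compare every pair."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False


def has_duplicate_set(items):
    """O(n) average time, O(n) extra space: remember what we've seen."""
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False
```

Same answer either way; the transform only moves cost between the time and storage columns.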
The goal of this thread is to help me think of things to do to learn generally useful algorithms skills, such that I might discover normative things on my own. When you can discover something on your own, it means you can also discover more relevant or complex things, rather than being stuck with the toolbox a teacher gives you.
—— In order to do that, it’s helpful to search for algorithmic and structural components to practice mentally manipulating. The basic core of creative algorithm design, to me, seems to be the ability to imagine all of the features that programming languages offer, used in unbounded combination. Like practicing musical scales, it could make sense to imagine each feature used in combination with each other feature, in each way it might be used. Of course, some combinations are far, far more useful than others, and we want those to be easy and quick to identify. —- My situation is a little unique in that I tend to get “triggered”, quotes perhaps unneeded, by components and combinations that have been a crux in projects I worked on at certain times. For me, practicing the triggering components and combinations is possibly both the most difficult and the most useful. Luckily for developing useful notes, these are numerous and generally very useful for multiple things. Graphs, for example: I have a big inhibition around anything to do with graphs.
———- When I try to come up with an algorithm or data structure, I’m usually focused on some property of complexity, something I am optimizing, even if only out of habit. If there’s no immediate reason to optimize for anything, I used to optimize for speed, and this was what most people usually did. When doing this, I roughly assumed there were no bounds: that anything could be optimized as much as needed, if sufficient design effort was invested. Modern research has, in my opinion, validated this, finding impressively powerful heuristics when a culture values the concept strongly enough. Math can seem to disagree sometimes. This doesn’t worry me. There are a handful of properties and component combination groups useful for normal optimization goals. Some of these I used to be very familiar with; others I have never learned. I left CS as a freshman or sophomore undergrad to study wilderness survival. When navigating the different options available, as when navigating anything else, one holds a sense of how much return there would be from spending time investigating each possibility. We don’t want to explore every possibility exhaustively, but we do want enough ease and skill to keep finding things that are increasingly useful. So sometimes we decide some options are more interesting than others.
—— Common properties for design optimization I’ve seen include:
- speed of CPU execution
- memory usage
- disk usage
- network data size
- frequency or count of reads or writes
- needed dependency libraries or platforms

There are others. A basic concept when considering design optimization properties is the reason for them. The reason for selecting these is usually the experience of the system’s users: for the system to be useful and/or enjoyable, one commonly wants it to be responsive. Another common optimization reason is price, such that the project is cheaper to build.
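As a sketch of how two of these properties trade against each other (my example, using Python’s standard `zlib`): compressing a payload before sending it shrinks network data size at the price of CPU time on both ends.

```python
import zlib

# A deliberately repetitive payload, the kind that compresses well.
payload = b"the same record repeated many times " * 200

# Spend CPU here (level 9 is the slowest, tightest setting) to send
# fewer bytes over the wire.
compressed = zlib.compress(payload, 9)

# The receiver spends CPU again to recover the original bytes.
restored = zlib.decompress(compressed)
```

Whether this trade is worth making depends on which property the users actually feel: on a slow link, the smaller transfer wins; on a fast link with a weak CPU, it may not.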
On the topic of user experience, another concept is that systems can be designed to measure the user experience and change their approach dynamically. There used to be domains in which this was commonplace, but the practice doesn’t seem to have survived well, as far as I can see. A simple example is OS thrashing: some user-focused heuristics there could prevent the whole-system halt users experience when they try to process more data than they have RAM to hold. —- Another design axiom I tend to hold is that optimization properties are exchangeable. That is, if something is very heavy on CPU usage but uses very little memory, then there is some transform that can be applied to make it use a lot of memory and very little CPU. Practicing finding these various transforms seems like it could be a useful skill-building exercise.
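A minimal sketch of one such exchange, assuming Python and using naive Fibonacci only as a stand-in workload: memoization spends memory on a cache of subresults to eliminate exponential recomputation.

```python
from functools import lru_cache


def fib_slow(n):
    """Exponential time, constant extra space: recompute everything."""
    if n < 2:
        return n
    return fib_slow(n - 1) + fib_slow(n - 2)


@lru_cache(maxsize=None)
def fib_fast(n):
    """Linear time, linear space: the cache remembers every subresult."""
    if n < 2:
        return n
    return fib_fast(n - 1) + fib_fast(n - 2)
```

The transform here is mechanical, which is part of what makes it good practice material: wrap the function in a cache, and CPU cost has been exchanged for memory cost.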
participants (1)
-
Undescribed Horrific Abuse, One Victim & Survivor of Many