Ruminations on excerpts of research papers, blogs and books

ML/DL/AI subfields: present and future

Long gone are the days when ML students used to code up RNNs and CNNs and train their very own models on tasks. Transformers put an end to that. Why? Scalability. Transformers only outperform the other architectures when scaled, and hence the average individual could only stand in awe as millions of dollars, thousands of GPUs, and terabytes of storage were used to train foundation models. This paradigm shift is reminiscent of technology that lies beyond the reach of the individual. However, as any field matures, one can find niches in which to lodge one's efforts, so I have gathered a few possible paths here. These may become obsolete or solved in the coming years, but I shall not remove from this list, only extend it, to track the growth of the field: