It's an understatement to say that "AI" (really, Transformer-based LLMs) is actively remaking the world. Yes, other forms of AI exist. Diffusion models and RL still matter and, depending on the application domain, can matter more than Transformers.
Over the last two years, I built an industry-killer: an AI-powered platform for high-quality article generation. The "Eureka!" moment wasn't discovering that Generative AI could do this. It was realizing that the REAL problem is creating editin...
In November 2024, the CEO of a global digital advertising company called me about a problem potentially costing his firm millions: lost Requests for Proposal (RFPs).
Complex prompts often produce good results up to a point, after which additional refinements can actually reduce output quality. This happens because every added instruction competes for the same normalized attention budget inside the Transformer: the more instructions you pile on, the thinner each one's share of the model's focus becomes.
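A deliberately simplified sketch of that intuition, in NumPy: softmax normalization forces attention weights to sum to 1, so equally salient instructions split a fixed budget. Real models have many heads and layers, and each instruction spans many tokens, so treat the single-score-per-instruction setup below as an illustration, not a model of any particular LLM.

```python
import numpy as np

def attention_weights(scores):
    # Softmax: the weights always sum to 1, no matter how many items compete.
    e = np.exp(scores - scores.max())
    return e / e.sum()

# Hypothetical salience scores for instruction "slots" in a prompt.
# As more equally relevant instructions are appended, each one's share
# of the fixed attention budget shrinks.
for n in (2, 5, 10, 20):
    scores = np.full(n, 1.0)  # n equally salient instructions
    w = attention_weights(scores)
    print(f"{n:2d} instructions -> each gets {w[0]:.3f} of the attention mass")
```

Running this prints 0.500, 0.200, 0.100, and 0.050: the twentieth instruction doesn't add focus, it dilutes it.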
For the past two years, I've served as the Lead Architect (and frequent bench engineer) on several applied projects integrating various Large Language Models (LLMs).
Most of what folks call "AI" now revolves around Generative Pre-trained Transformers (GPTs). The key to understanding how GPTs work is the concept of "Attention".
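To make "Attention" concrete, here is a minimal NumPy sketch of scaled dot-product attention, the operation at the heart of the Transformer: each token's query is compared against every token's key, and the resulting weights decide how much of each value flows into the output. The shapes and random inputs below are toy choices for illustration, not taken from any real model.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max before exponentiating for numerical stability.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # how well each query matches each key
    weights = softmax(scores)        # each row sums to 1: one token's focus
    return weights @ V, weights

# Toy example: a 3-token sequence with 4-dimensional embeddings.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
output, weights = scaled_dot_product_attention(Q, K, V)
print(weights)  # row i = how much token i attends to every token
```

Each row of `weights` is one token deciding how much to "pay attention" to every other token; the output is simply the value vectors blended by those weights.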
Clarity in verbal and written communication is among the most critical skills we develop as human beings. But complex vocabulary, while efficient among expert audiences with shared context, is often the wrong approach.