Blog

Latest Posts

Reflections On 2025 As An Entrepreneur In Residence, AI

Frederick Lowe - December 16, 2025

It's an understatement to say that "AI" (really, Transformer-based LLMs) is actively remaking the world. Yes, other forms of AI exist. Diffusion models and RL still matter and, depending on the application domain, can matter more than Transformers.

Consensus Editing for Scaled AI Publishing

Frederick Lowe - December 8, 2025

Over the last two years, I've built an industry-killer: an AI-powered platform for high-quality article generation. The "Eureka!" moment wasn't discovering that Generative AI can be used to do this. It was realizing that the REAL problem is creating editin...

(A)I Know What You Did Last Summer

Frederick Lowe - October 13, 2025

In November 2024, the CEO of a global digital advertising company called me about a problem potentially costing his firm millions: lost Requests for Proposal (RFPs).

Prompt Chaining For Incremental Improvement of AI Outputs

Frederick Lowe - July 21, 2025

Complex prompts often produce good results up to a point, after which additional refinements can reduce output quality. This happens because additional instructions create competing attention patterns in the transformer's processing.
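To make the idea concrete, here's a minimal sketch of prompt chaining, assuming the OpenAI Python client (openai>=1.0); the model name and prompts are illustrative placeholders, not taken from the article. Each refinement runs as its own call against the prior output instead of piling more instructions onto a single prompt.

```python
# Minimal prompt-chaining sketch. Assumes the OpenAI Python client;
# the model name and prompts below are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    """Send a single prompt and return the model's text response."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# Step 1: draft with a narrowly scoped prompt.
draft = ask("Write a short product description for a standing desk.")

# Step 2: refine the previous output in a separate call, rather than
# stacking more instructions onto the original prompt.
refined = ask(f"Tighten this copy to under 50 words, active voice:\n\n{draft}")

print(refined)
```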

Structuring AI Output for Production Systems

Frederick Lowe - March 10, 2025

For the past two years, I've served as the Lead Architect (and frequent bench engineer) on several applied projects integrating various Large Language Models (LLMs).

Understanding Transformer Attention

Frederick Lowe - January 8, 2025

Most of what folks call "AI" now revolves around Generative Pre-trained Transformers (GPTs). A key step in understanding how GPTs work is understanding "Attention".
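As a rough illustration of the idea (not code from the article), here's a small NumPy sketch of scaled dot-product attention: each query token computes a softmax-weighted mix of the value vectors based on its similarity to every key.

```python
# Scaled dot-product attention sketch with NumPy; shapes and values are
# toy examples, not drawn from the article.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query attends to every key; weights sum to 1 per query."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ V                                # weighted mix of values

# Three tokens with four-dimensional embeddings (toy numbers).
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 4)
```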

Word Salad Emitter

Frederick Lowe - October 14, 2024

Clarity in verbal and written communication is among the most critical skills we learn as human beings. But complex vocabulary, while efficient among expert audiences with shared knowledge, is often the wrong approach.