Koding Harness: notes on agent-first engineering
[Image: hub-and-spoke diagram of a central engineer node connected by amber dashed lines to six purple agent nodes]

From Programmer to Harness Engineer

Codex didn’t make me a faster programmer. It changed what programming means.

Feb 23, 2026
[Image: pipeline diagram showing THREAD → TURN → ITEM primitives with event stream and approval gate]

Designing an AI-Native SDLC with Codex App Server

Thread, turn, item: the three primitives that turn an AI assistant into engineering infrastructure.

Feb 19, 2026
[Image: Cerebras defect-tolerance simulator and Codex-Spark real-time coding workflow]

Meta-Engineering: Letting Codex-Spark Simulate the Silicon It Runs On

Prompting a wafer-scale chip simulator at 1,000 tokens per second

Feb 14, 2026

CodeClash: Training AI Agents for Long-Horizon Programming

Expanding code post-training beyond unit tests to competitive, long-horizon programming tasks with CodeClash’s new training arenas

Jan 7, 2026

LLM Inference Deep Dive: Metrics, Batching & GPU Optimization

Technical exploration of LLM inference metrics, batching strategies, and GPU optimization with TensorRT-LLM, from latency metrics to in-flight batching

Dec 9, 2024

NeurIPS 2023: Trends, Talks, and Technologies

I don’t think the above quote applies to me, but I nevertheless wanted to share my first NeurIPS experience. To start with, it was an exhilarating whirlwind, packed with a…
Dec 17, 2023

The Future of AI is Modular: Insights from ModCon 2023

Today, the landscape of AI development is on the brink of a transformative change. I had the opportunity to witness this first-hand at the ModCon conference in San…
Dec 4, 2023

Watermark Security in Language Models

In this brief exploration, I delve into a groundbreaking concept: a unique watermarking technique for large language models (LLMs), as detailed in A Watermark for Large…
Nov 20, 2023

Concepts from Operating Systems That Found Their Way into LLMs

Diving into the intricacies of technology often uncovers unexpected parallels. Recently, I’ve been struck by how foundational computer operating system concepts are making…
Oct 22, 2023

LLMs for Compiler Optimization: A Deep Dive

The application of Large Language Models (LLMs) in optimizing LLVM assembly for code size is emerging, but is it truly shaping a new reality or just a theoretical…
Sep 19, 2023

Exploring Ways to Extend Context Length in Transformers

Diving deep into the intricacies of large language models (LLMs), one hurdle quickly becomes evident: the context length limitation. While many recognize its implications…
Aug 7, 2023

Reflections from an Evening with NVIDIA’s Jensen Huang

Listening to a fireside chat with Jensen Huang, the dynamic Founder and CEO of NVIDIA, alongside three pioneering startup founders: Dr. Jaroslaw “Jarek” Kutylowski of DeepL…
Jul 4, 2023

Navigating the AI Landscape: My Takeaways from Sam Altman’s Discussion

At a recent event at my alma mater, TU Munich, featuring Sam Altman, CEO of OpenAI, attendees had the opportunity to delve into a range of topics. From AI regulation, the…
May 27, 2023

StarCoder: A Revolutionary Code Generation Model

Are you tired of spending hours writing repetitive code? Do you find yourself searching through documentation and Stack Overflow for code snippets? Look no further!…
May 22, 2023

A Deep Dive into the LLM Bootcamp Experience: Revolutionizing AI-Powered Applications

The world of technology is currently undergoing a monumental transformation, and my participation in the LLM Bootcamp was nothing short of enlightening from my perspective.…
Apr 26, 2023

A Philosophical Dive into Deep Learning: My Experience at the NYU Conference

Last week, I was fortunate enough to attend the captivating Philosophy of Deep Learning Conference at New York University. This event united experts from diverse fields to…
Mar 28, 2023

Exploring the Latest Advancements in Transfer Learning: A Summary of ICLR’23 Transfer Learning-Related Papers

The International Conference on Learning Representations (ICLR) is one of the top conferences in the field of machine learning, and this year’s conference (ICLR 23) features…
Jan 23, 2023

From Alps to NLP: A 2022 Recap of Exploration and Growth

As I look back on the year 2022, I can’t help but feel a sense of nostalgia and wonder.
Dec 26, 2022

Herzlich Willkommen!

Welcome to my new blog and my new home on the internet!
Dec 25, 2022