s20e01: Better than Average; How People Work

0.0 Context Setting

Friday, 8 August 2025, on the cusp of a heat wave in Portland, Oregon.

Well. It’s been a while, hasn’t it. I will just say that gestures events occurred, those events weren’t particularly fun for anyone involved, and with any luck they’ll abate.

0.1 Hallway Track

While there are no Hallway Tracks coming up, I have three planned in my head that I’m super excited about. Hopefully I’ll have something to share soon.

1.0 Some Things That Caught My Attention

1.1 Better than Average

My friend Naomi Alderman posted an observation a few weeks back that went a bit like this:

  • LLMs / generative AI / token prediction machines are trained on a mass of data
  • They are essentially designed to produce the most likely next token in a series of tokens
  • They work because of the sheer amount of data they’re trained on
  • So they are very good at producing average text

Average here doesn’t mean “bad”. It means likely. Likely doesn’t mean true, but it doesn’t necessarily mean untrue, either.
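To make “most likely next token” concrete, here’s a deliberately tiny sketch in Python: a bigram model that, at each step, emits whichever word most often followed the previous one in its training text. This is my illustration, not how a real transformer works; real models condition on the whole context and sample from a probability distribution rather than always taking the single likeliest token, but the mechanic is the same in spirit.

```python
from collections import Counter, defaultdict

# A toy corpus standing in for "a mass of data".
corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog . the dog chased the cat ."
).split()

# Count which token follows which: a bigram model.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def generate(start, length=8):
    """Greedily emit the single most likely next token at every step."""
    out = [start]
    for _ in range(length):
        candidates = following.get(out[-1])
        if not candidates:  # dead end: nothing ever followed this token
            break
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

print(generate("the"))  # something like "the cat sat on the cat sat on ..."
```

Run it and the output is grammatical, plausible, and utterly average: the model can only ever say the kinds of things its corpus said most often.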

The average output may also be new output, in that it’s the product and confluence of influences (i.e. prompts) and associations (i.e.