Generative AI for Genealogy – Part III

How many skills can a tiny model handle?

Good question. I have no idea.

But here’s what I do know:

  • With 23 skills, the prompt uses 463 tokens.
  • My Llama 3.2 1B setup gives me a 2048‑token context window.
  • That works out to roughly 20 tokens per skill, so in theory I could scale to around 100 skills, depending on description length and how much of the window the question and answer need (there's a rough sketch of the arithmetic below).
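Here's that arithmetic as a quick Python sketch. Only the 463‑token and 23‑skill figures come from my measurements above; the 300‑token reserve for instructions, the question, and the reply is an assumption, so treat the result as a ballpark rather than a hard limit.

```python
# Back-of-the-envelope estimate of how many skills fit in the context window.
# Measured values: 463 prompt tokens for 23 skills. The 300-token reserve for
# instructions, the user's question and the model's reply is an assumption.

CONTEXT_WINDOW = 2048   # tokens available in my Llama 3.2 1B setup
PROMPT_TOKENS = 463     # measured prompt size with 23 skills
SKILL_COUNT = 23
RESERVED = 300          # assumed headroom for everything that isn't a skill description

tokens_per_skill = PROMPT_TOKENS / SKILL_COUNT            # roughly 20 tokens each
max_skills = int((CONTEXT_WINDOW - RESERVED) // tokens_per_skill)

print(f"~{tokens_per_skill:.0f} tokens per skill, room for roughly {max_skills} skills")
```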

If I ever reach that point, I may need a hierarchical system, with skills grouped into categories and sub‑skills beneath them. But that’s a problem for Future Dave.
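If Future Dave does end up needing it, the idea would be two small prompts instead of one big one: the first lists only the categories, the second lists only the skills inside the chosen category. Here's a minimal sketch of that shape; the categories, skill names, and descriptions are hypothetical placeholders, not the real skill list.

```python
# Minimal sketch of two-stage skill routing: pick a category first, then a
# skill within it, so no single prompt has to list every skill.
# All category and skill names below are made-up placeholders.

SKILL_TREE = {
    "people": {
        "find_person": "Find a person by name and approximate birth year.",
        "list_children": "List the children of a given person.",
    },
    "places": {
        "births_in_place": "List people born in a given place.",
    },
    "dates": {
        "events_in_year": "List events recorded in a given year.",
    },
}

def category_prompt() -> str:
    """First prompt: just the category names and the skills they contain."""
    lines = ["Pick the single best category for the question below.", ""]
    for category, skills in SKILL_TREE.items():
        lines.append(f"- {category}: {', '.join(skills)}")
    return "\n".join(lines)

def skill_prompt(category: str) -> str:
    """Second prompt: only the skills inside the chosen category."""
    lines = [f"Pick the single best skill from the '{category}' category.", ""]
    for skill, description in SKILL_TREE[category].items():
        lines.append(f"- {skill}: {description}")
    return "\n".join(lines)

print(category_prompt())
print()
print(skill_prompt("people"))
```

Each prompt stays small no matter how many skills the tree holds, at the cost of a second round trip to the model.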

Coming up next: Normalisation

In the next post, I’ll dig into the surprisingly messy world of normalisation: how to turn unpredictable human questions into something an LLM can reliably understand.

Spoiler: it involves synonyms, spelling variants, and the occasional linguistic eye‑roll.
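To give just a flavour of the shape of the problem (the details are for Part IV), here is a minimal sketch of the kind of mapping I mean; the synonym table and the example question are made up for illustration.

```python
import re

# A made-up synonym/typo table mapping the variants people type onto the
# canonical terms the skill descriptions use. Illustrative only.
SYNONYMS = {
    "grandma": "grandmother",
    "granny": "grandmother",
    "gran": "grandmother",
    "birht": "birth",   # a typo worth catching
}

def normalise(question: str) -> str:
    text = question.lower().strip()
    for variant, canonical in SYNONYMS.items():
        # \b stops 'gran' matching inside words like 'grandmother'
        text = re.sub(rf"\b{re.escape(variant)}\b", canonical, text)
    return text

print(normalise("Where was my Gran born?"))  # -> where was my grandmother born?
```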

Next part: Generative AI for Genealogy – Part IV
