
Roundup #4: Learn what a token is and what ChatGPT thinks you look like.

I'm an old grumpy guy ... hope you get better results.


👋🏼 Hey there! Welcome to another roundup edition of Offload, a newsletter for professionals to learn how to build products and automate work with AI and no-code tools.

I'm excited about today's issue! 🥳 While it's a light one, I had fun preparing it. Hope you enjoy.

Here's what you'll find in it:

  • A sneak peek of a workflow that will save you hundreds of hours, guaranteed

  • A new tool to create UI designs with AI… super helpful

  • The BEST explanation on the internet of what a token is, also guaranteed

  • A prompt to summarize any call transcript into ready-to-use content

  • Learn what ChatGPT thinks you look like

-Offload This-

Workflow: from call transcript to any document

People spend a lot of time preparing templated documents based on information provided in a meeting. A. LOT.

For example, you do that great sales call, with solid rapport, clear needs … then you have to write the freaking proposal 😢 

Next week's teardown will solve this.

You’ll learn how to automate the creation of a business proposal from a sales call transcript using AI → summarizing the conversation, extracting key insights, and preparing templated documents.

But this isn’t just about sales. This same workflow works for tons of other cases:

  • Briefing call → Scope of Work

  • Pitch meeting → Investment Memo

  • Discovery call → Client Plan

  • Sync call → Internal Summary

If your meeting leads to a document, this pattern applies.

Learning to set this up takes maybe a couple of hours. It can free up hundreds of hours every month - and even more mental energy.

Here is a 35s video of the final workflow in action:

In this edition, however, I'm detailing how the prompt of the first AI module in the workflow works (scroll down 👇️ )

-Tool of the Week-

Tool of the Week: Magic Patterns

❗ What it does: A no-code UI mockup tool powered by AI with a clean interface and easy Figma export.

Why it’s worth checking out:

✅ Smoother interface than its competitor V0
✅ One-click export to Figma (plus easy link sharing)

A few drawbacks:

⚠️ Occasional outages and slowdowns
⚠️ Limited advanced customization options

How I used it:

I used it to whip up a mockup for a Chrome extension I’m building.

My take:

Great for quick, shareable mockups → I'll keep using it for my next projects.

-Prompt Template-

Summarize a call transcript into ready-to-use content

This prompt turns a raw client call transcript into a sales proposal summary, breaking it down into actionable building blocks:

  • What’s the client struggling with?

  • What do they want?

  • What service are you offering, with scope, duration, and price?

You will use this exact prompt in the workflow teardown I'm sharing next week, but you can use it directly in any AI chat you like, too!

Get a sales call transcript, paste it in the last section of this prompt, and boom: summary made.

I highlighted in blue the sections you can customize for your needs. For example, if you want the AI to look for other information in the transcript, substitute the bullet points both in #Task and #Output sections.

#Role

You are a world-class transcript analyzer with a particular expertise in preparing sales proposals from call transcripts with clients.

#Task

Your task is to summarize a call transcript into content blocks.

You will think step-by-step through the following process to ensure a good outcome:

- Identify the client's context (the challenge the client faces)

- Identify the client's needs (the objective of the service)

- Identify the proposed solution name (the name of the service)

- Identify the solution scope (what is included in the service)

- Identify the service duration (for how long the service will be delivered)

- Identify the solution pricing (the price of the service)

- Identify the next steps

#Context

This task is crucial for my company and the output will be used to create a service sales proposal

#Instructions

- It is extremely important to my career that you use exclusively the information provided in the transcript and inputs here, NEVER creating data or information

- Your output will be in English, unless specified otherwise

- If you don't find any of the required information, simply indicate "NA". DO NOT invent information.

##Output specifications

The output should be a summary text for each of the following items:

- Context: [The context of the client is ...]

- Objective: [The client requires ...]

- Proposal: [Name of the service]

- Scope: [The service includes ...]

- Duration: [The service will last ...]

- Pricing: [Price]

- Next steps: [Next Steps]

#Input

[insert call transcript here]

-Learn AI-

What is a Token?

What is the first thing you think of when someone says Token?

If Lord of the Rings is your answer, that's a good sign. You're just a little nerd and in the right place to learn what a token (not Tolkien) is.

Disappointed Gandalf

Let’s get started:

Words? Not quite. Characters? Not exactly. Tokens.

When you type something into an AI, like:

"How do I cook a perfect soft-boiled egg?"

It doesn’t read that as one sentence or even ten separate words. Instead, the sentence gets broken down into tokens - small chunks of text.

Some tokens are full words (egg), some are pieces of words (soft and -boiled from soft-boiled), and some are punctuation marks (? counts too).

So that sentence above might become:

["How", " do", " I", " cook", " a", " perfect", " soft", "-", "boiled", " egg", "?"] 

That’s 11 tokens.

This process is called tokenization, and it's how AI converts language into something it can process.
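To make the idea concrete, here's a toy tokenizer you can run. Heads up: this is a simplified sketch that just splits on word boundaries and punctuation; real GPT tokenizers learn their chunks from data (see the BPE section below) and often split words into sub-word pieces.

```python
import re

def toy_tokenize(text):
    """Split text into word-like chunks and punctuation marks.

    Toy illustration only - real tokenizers (like OpenAI's) learn
    sub-word chunks from data instead of using a fixed regex.
    """
    return re.findall(r"\w+|[^\w\s]", text)

tokens = toy_tokenize("How do I cook a perfect soft-boiled egg?")
print(tokens)
# ['How', 'do', 'I', 'cook', 'a', 'perfect', 'soft', '-', 'boiled', 'egg', '?']
print(len(tokens))  # 11
```

Same 11 chunks as the example above - the sentence stops being "one sentence" and becomes a list of pieces the model can actually process.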

How is a token determined?

Tokens aren’t randomly chosen. They’re defined by a tokenizer, a tool created before the AI model is trained. Take a look at OpenAI's tokenizer tool here.

OpenAI Tokenizer Tool

Most modern models, like GPT, use a method called Byte Pair Encoding (BPE). Here's what happens:

  1. Engineers feed massive amounts of text into an algorithm

  2. It looks for common letter combinations (like th, ing, tion)

  3. It builds a dictionary of reusable text chunks that balance between whole words and shorter fragments

  4. That dictionary becomes fixed. The model will use it forever

So:

  • Common words might be a single token: "apple"

  • Rare or weird ones get split: "believathon" becomes ["believ", "athon"]

The model can’t invent new tokens later. It can only use the pieces it was trained with.
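If you're curious what steps 1-4 look like in practice, here's a minimal sketch of the BPE training loop on a tiny made-up corpus. This is the classic textbook version, not OpenAI's actual implementation (which works on bytes and massive datasets), but the merging idea is the same: repeatedly find the most frequent adjacent pair and fuse it into a new chunk.

```python
import re
from collections import Counter

def get_pair_counts(words):
    """Count adjacent symbol pairs across the whole corpus."""
    pairs = Counter()
    for word, freq in words.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, words):
    """Fuse the chosen pair into a single symbol everywhere it appears."""
    pattern = re.compile(r"(?<!\S)" + re.escape(" ".join(pair)) + r"(?!\S)")
    return {pattern.sub("".join(pair), word): freq for word, freq in words.items()}

# Tiny corpus: each word starts as space-separated characters,
# with how often it appears in the (imaginary) training text.
words = {"l o w": 5, "l o w e r": 2, "n e w e s t": 6, "w i d e s t": 3}

for step in range(4):
    pairs = get_pair_counts(words)
    best = max(pairs, key=pairs.get)  # most frequent adjacent pair
    words = merge_pair(best, words)
    print(step, best)  # first merges: ('e', 's'), then ('es', 't'), ...
```

After a few merges, frequent chunks like "est" and "low" become single tokens, while rarer words stay split into pieces - exactly the "believ" + "athon" behavior described above.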

Do tokens carry meaning?

On their own? Not really.

A single token like "un" doesn’t tell us much. But put "un", "believ", and "able" together, and suddenly, meaning.

The model doesn’t treat tokens as meaningful in isolation. But it learns what tokens tend to mean based on how they appear in context. That’s where the intelligence comes in.

It’s not the token, it’s the pattern.

You can think of tokens as kind of like grammatical building blocks - prefixes, roots, and suffixes. For example, "unbelievable" might be split into "un", "believ", and "able", just like you’d break it down in grammar class.

But to be clear: this resemblance is accidental. The tokenizer isn’t doing grammar, it’s just finding statistically efficient chunks of text.

It just so happens that language patterns and grammar often overlap.

Why do AI models use tokens instead of words?

Because words are too unpredictable, and too many.

Here’s why tokens are smarter:

  • More efficient: Instead of memorizing every possible word and its forms (run, running, ran, runs), the model learns patterns in token pieces. Fewer parts, more combinations

  • More flexible: Tokens let the model deal with typos, slang, made-up words, and multiple languages

  • More scalable: Full-word vocabularies would be enormous and inefficient. Token systems keep things compact and fast to train

Think of tokens like Lego bricks. Words are finished toys, limited and rigid. Tokens are pieces you can recombine to build almost anything.

Why does this matter?

Tokens are the currency of language models.

  • They define what the model “sees” and understands

  • They control memory. GPT-4-turbo can handle up to 128,000 tokens, roughly 300 pages

  • They affect cost. API usage and pricing are measured in tokens

  • And when your chatbot forgets earlier parts of a long conversation? It probably hit the token limit
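Since usage is billed per token, a quick back-of-envelope check is handy before you paste a giant transcript into an API call. The price below is an assumption for illustration (check your provider's current pricing page), and the ~4-characters-per-token figure is just a rough rule of thumb for English text.

```python
CONTEXT_WINDOW = 128_000       # tokens (GPT-4-turbo class, as mentioned above)
PRICE_PER_1K_INPUT = 0.01      # ASSUMED price in USD per 1,000 input tokens

def estimate_tokens(text):
    """Very rough estimate: ~4 characters per English token."""
    return max(1, len(text) // 4)

transcript = "word " * 8000    # stand-in for a long call transcript (~40k chars)
n_tokens = estimate_tokens(transcript)

print(f"~{n_tokens} tokens")
print(f"fits in the context window: {n_tokens <= CONTEXT_WINDOW}")
print(f"estimated input cost: ${n_tokens / 1000 * PRICE_PER_1K_INPUT:.2f}")
```

For exact counts, use your provider's tokenizer (OpenAI's tokenizer tool linked above does this interactively) rather than the character heuristic.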

Wrapping it up

Tokens may sound like a small detail, but they shape everything from how AI reads your input to how much it can remember and how much it costs to use.

Understanding tokens means understanding the very language of AI. And now, you're fluent.

Next time someone says "token", you’ll know it’s not about cryptocurrency or Tolkien. It’s about how machines learn to understand us, one chunk at a time.

-Tools I'm Using- 

  1. Claude: Claude is a conversational AI developed by Anthropic, known for its clarity, thoughtful answers, and ability to follow complex instructions better than most. It shines in writing, summarizing, and coding - especially when you want clean, structured output.

    I'm currently using it to learn Cursor, and I have to say: I'm a paying user now 💸💸 
    Using score: ⭐️⭐️⭐️⭐️

  2. Cursor.sh: a code editor with AI built in to help you write and understand code, even if you’re not an expert. You can ask it to explain what a piece of code does, fix errors, or suggest new code to add, all directly in the tool.

    I’m currently using it to build small projects and learn how things work step by step. It is insanely good!
    Using score: ⭐️⭐️⭐️⭐️⭐️

Stay up-to-date with AI

The Rundown is the most trusted AI newsletter in the world, with 1,000,000+ readers and exclusive interviews with AI leaders like Mark Zuckerberg, Demis Hassabis, Mustafa Suleyman, and more.

Their expert research team spends all day learning what’s new in AI and talking with industry experts, then distills the most important developments into one free email every morning.

Plus, complete the quiz after signing up and they’ll recommend the best AI tools, guides, and courses – tailored to your needs.

-Fun with AI-

What does ChatGPT think you look like?

It turns out you can ask ChatGPT to create a picture of how it imagines you look, based on your chat history.

Here is the prompt:

Generate an image of what you think I look like based on our entire chat history.
Make it photo realistic and be honest with the image. The image should be a best guess based on our chat history, have some fun with it! Do not ask me to upload a photo. Once the image has been generated please include a detailed breakdown of your thought process and information used.

Here is my result:

It turns out I'm an out-of-shape, worried, grumpy old man.

So, both ChatGPT and my wife have similar views about me. However, the former provided arguments and explanations immediately after generating the image - something the latter doesn't always do.

Have a nice weekend! 😃 

-Any feedback-

Before you go I’d love to know what you thought of today's newsletter to help me improve for you.

How do you rate today's edition?
