Al-Khwarizmi a day ago

They make LLMs play a very abstract game that rewards them with points for answering the same as the other and punishes them for answering differently, and the LLMs tend to converge on an answer. From that to "social conventions" there is a long, long stretch. The paper lacks a baseline - wouldn't much simpler (non-LLM) systems also exhibit the same property? Is it that surprising that systems that are clones of each other (because they didn't even try "mixed societies" of different LLMs) agree when you give them points for agreeing?

Maybe I'm missing something but in my view this is pure hype and no substance. And note that I'm far from an LLM skeptic and I wouldn't rule out at all that current LLMs could develop social conventions, but this simulation doesn't really show that convincingly.
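A non-LLM baseline of the sort this comment asks for is easy to sketch: agents with no language model at all, playing a pairwise game that rewards matching answers, also converge on a single "convention". This is a minimal version of the classic naming game; the function name and parameters here are illustrative, not the paper's actual setup.

```python
import random

def naming_game(n_agents=50, max_rounds=100000, seed=0):
    """Minimal non-LLM baseline: agents hold sets of candidate names.
    Each round a random speaker utters a name to a random hearer; a
    match makes both collapse to that name (the 'reward'), a miss makes
    the hearer memorize it. No model, no leaders, no communication
    beyond the game itself."""
    rng = random.Random(seed)
    vocab = [{f"name{i}"} for i in range(n_agents)]  # everyone starts unique
    for t in range(max_rounds):
        s, h = rng.sample(range(n_agents), 2)
        word = rng.choice(sorted(vocab[s]))
        if word in vocab[h]:          # agreement: both commit to the word
            vocab[s] = {word}
            vocab[h] = {word}
        else:                         # disagreement: hearer learns it
            vocab[h].add(word)
        if all(len(v) == 1 and v == vocab[0] for v in vocab):
            return t + 1              # rounds until global consensus
    return None

print(naming_game() is not None)  # consensus emerges without any LLM
```

The point of the sketch is only that "reward agreement, get convergence" needs nothing LLM-specific.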

tbrownaw a day ago

As obviously silly as this is, could it actually be useful for getting the observed phenomena acknowledged by people who might just tune out if presented with the underlying math on its own?

How much overlap is there between people who think LLMs are magic and people who don't approve of applying math to groups of people? And how many in the overlap have positions where their opinions have outsized effects?

lostpilot a day ago

Crazy - this is saying agents develop their own biases and culture through their engagement with one another even if the individual agents are unbiased.

  • Animats a day ago

    That may be reading too much into this behavior. Watch this video of metronomes self-synchronizing.[1] That's a pervasive phenomenon. Anything with similar oscillation frequency and coupling will do this. (Including polling systems with fixed retry intervals.)

    Are you sure that's not just this effect?

    [1] https://www.youtube.com/watch?v=Aaxw4zbULMs

    • drdaeman a day ago

      Yes, but aren't cultures essentially the same way - people grouped together getting influenced by each other's actions and ending up learning from each other (introducing bias into the individual agents), doing things together, appreciating similar stuff, talking in particular ways and so on? AFAIK, "culture" essentially means "stuff that goes on in this particular space and time".

      • Animats a day ago

        The article hypothesized leaders and followers. That's not necessary. Drift towards the mean, plus some noise, is sufficient.
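The "drift towards the mean, plus some noise" claim is easy to check with a toy simulation (all numbers here are made up for illustration): agents that each nudge their opinion toward the current population average end up tightly clustered, with no leader/follower roles at all.

```python
import random

def drift_sim(n=100, steps=500, rate=0.1, noise=0.01, seed=1):
    """Each agent takes a small step toward the population mean,
    plus Gaussian noise. No leaders, no followers."""
    rng = random.Random(seed)
    x = [rng.uniform(0, 1) for _ in range(n)]  # opinions spread over [0, 1]
    for _ in range(steps):
        m = sum(x) / n
        x = [xi + rate * (m - xi) + rng.gauss(0, noise) for xi in x]
    return x

x = drift_sim()
print(max(x) - min(x) < 0.2)  # spread started near 1.0, ends tightly clustered
```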

        • lou1306 16 hours ago

          I briefly worked on synchronized applause (well, a toy discretised cellular-automaton-like model of it), and the individual agents don't even need to know the mean, or to receive continuous feedback (which is what happens with synchronised metronomes).

          As long as they can infer a collective pace from the general loudness in the room, they can make very basic adjustments to their own clapping phase and they will get in sync.

          And applause is just a stand-in for any locally periodic behaviour with regular signals (claps) which can be aggregated (more claps = louder).
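That aggregate-feedback mechanism can be sketched with a mean-field (Kuramoto-style) model, where the population's summed signal plays the role of "general loudness in the room". This is a stand-in illustration with made-up parameters, not the commenter's actual applause model.

```python
import cmath, math, random

def applause_sync(n=30, steps=2000, dt=0.05, k=1.0, seed=2):
    """Identical 'clappers' with random starting phases. Each one sees
    only the aggregate signal z (overall loudness and its timing), never
    any individual neighbour, and nudges its own phase toward it."""
    rng = random.Random(seed)
    theta = [rng.uniform(0, 2 * math.pi) for _ in range(n)]
    omega = 2 * math.pi  # shared natural clapping rate
    for _ in range(steps):
        z = sum(cmath.exp(1j * t) for t in theta) / n  # aggregate signal
        r, psi = abs(z), cmath.phase(z)
        theta = [t + dt * (omega + k * r * math.sin(psi - t)) for t in theta]
    z = sum(cmath.exp(1j * t) for t in theta) / n
    return abs(z)  # order parameter: 1.0 means perfectly in sync

print(applause_sync() > 0.99)  # the room ends up clapping in unison
```

With identical natural rates and any nonzero coupling, the order parameter climbs to 1: local phase adjustments against an aggregate signal are enough for global synchrony.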

amelius a day ago

I wonder when we will see LLMs being used to test economic theories.

  • laughingcurve a day ago

    Already exists in the current literature

    • falcor84 20 hours ago

      Interesting, any particular such paper(s) that you'd recommend?

    • tmpz22 a day ago

      And much of the current writing

dgfitz a day ago

It’s funny how we seem to be on this treadmill of “tech that uses GPUs to crunch data” starting with the Bitcoin thing, moving to NFTs, now LLMs.

Wonder what’s next.

ggm a day ago

So how many systems aside from Grok started to espouse Afrikaner propaganda, and how many systems aside from Meta's started to be Holocaust deniers?

Or are the walled gardens working to our advantage here?

  • otabdeveloper4 19 hours ago

    These things are trained on 4chan archives among other things, so all you need is a system prompt.

th0ma5 a day ago

Oh, I thought this was going to be about the cult-like in-group signaling of LLM advocates, but this is a thing that imagines language patterns as a society, instead of treating the language patterns of a society as a bias that the models have.

  • A4ET8a8uTh0_v2 a day ago

    This sounds dismissive, but it is interesting in two more obvious ways:

    1. The interaction appears to mimic human interaction online (trendsetter vs follower).

    2. It shows something some of us have been suggesting for a while: pathways for massive online manipulation campaigns.