Content provided by LessWrong. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by LessWrong or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://hu.player.fm/legal.
“Condensation” by abramdemski

30:29
 
Manage episode 519011554 series 3364758
Condensation: a theory of concepts is a model of concept-formation by Sam Eisenstat. Its goals and methods resemble John Wentworth's natural abstractions/natural latents research.[1] Both theories seek to provide a clear picture of how to posit latent variables, such that once someone has understood the theory, they'll say "yep, I see now, that's how latent variables work!".
The goal of this post is to popularize Sam's theory and to give my own perspective on it; however, it will not be a full explanation of the math. For technical details, I suggest reading Sam's paper.
Brief Summary
Shannon's information theory focuses on the question of how to encode information when you have to encode everything. You get to design the coding scheme, but the information you'll have to encode is unknown (and you have some subjective probability distribution over what it will be). Your objective is to minimize the total expected code-length.
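To make this concrete, here is a minimal sketch (the four-symbol source and its probabilities are hypothetical, not from the post): for a prefix-free code matched to the distribution, the expected code length sums p(x) times the length of x's codeword, and Shannon entropy gives the lower bound that such a well-matched code achieves.

```python
import math

# Toy source: a subjective distribution over four symbols (hypothetical example).
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# A prefix-free code matched to the distribution:
# each codeword has length -log2 p(x), so the code is optimal for p.
code = {"a": "0", "b": "10", "c": "110", "d": "111"}

# Expected code length: sum over symbols of p(x) * len(codeword(x)).
expected_len = sum(p[x] * len(code[x]) for x in p)

# Shannon entropy: the lower bound on the expected length of any prefix-free code.
entropy = -sum(px * math.log2(px) for px in p.values())

print(expected_len)  # 1.75 bits per symbol
print(entropy)       # 1.75 bits per symbol (bound met: code lengths match -log2 p)
```

Because every codeword length here equals exactly -log2 p(x), the expected length attains the entropy bound; for distributions whose probabilities are not powers of two, the optimal code falls within one bit of the entropy.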
Algorithmic information theory similarly focuses on minimizing the total code-length, but it uses a "more objective" distribution (a universal algorithmic distribution), and a fixed coding scheme (some programming language). This allows it to talk about the minimum code-length of specific data (talking about particulars rather than average [...]
---
Outline:
(00:45) Brief Summary
(02:35) Shannon's Information Theory
(07:21) Universal Codes
(11:13) Condensation
(12:52) Universal Data-Structure?
(15:30) Well-Organized Notebooks
(18:18) Random Variables
(18:54) Givens
(19:50) Underlying Space
(20:33) Latents
(21:21) Contributions
(21:39) Top
(22:24) Bottoms
(22:55) Score
(24:29) Perfect Condensation
(25:52) Interpretability Solved?
(26:38) Condensation isn't as tight an abstraction as information theory.
(27:40) Condensation isn't a very good model of cognition.
(29:46) Much work to be done!
The original text contained 15 footnotes which were omitted from this narration.
---
First published:
November 9th, 2025
Source:
https://www.lesswrong.com/posts/BstHXPgQyfeNnLjjp/condensation
---
Narrated by TYPE III AUDIO.
672 episodes
