- tree-of-thought-llm — [NeurIPS 2023] Tree of Thoughts: Deliberate Problem Solving with Large Language Models
The transformer, introduced in 2017, powers all modern LLMs. If you're familiar with Daniel Kahneman's Thinking, Fast and Slow, you could summarize LLMs as excellent System 1 thinking: fast, automatic, unconscious responses (e.g., autocomplete). I'd argue that we're one development (on the scale of the transformer) away from System 2 thinking: deliberate, strategic reasoning. In fact, with merely GPT-4 and some clever prompting architectures, researchers have developed chain-of-thought prompting and, more recently, tree-of-thoughts reasoning. While these techniques sit outside the LLM architecture itself, embedding them into an LLM could very plausibly produce System 2 thinking and the first real AGI. Adding more modalities (e.g., audio, images, video, topography, etc.) would simply add more nuance to the weights and biases of a complete system.
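The tree-of-thoughts idea can be sketched in a few lines: propose several candidate "thoughts" at each step, score them, prune to the most promising, and repeat. Here is a minimal, toy sketch of that loop, where `propose_thoughts` and `evaluate_thought` are hypothetical stand-ins for LLM calls (the real method prompts a model for both), and the task is a trivial one of building the string "2023" digit by digit:

```python
TARGET = "2023"

def propose_thoughts(state):
    """Stand-in for an LLM proposing next-step continuations (toy: one digit)."""
    return [state + d for d in "0123456789"]

def evaluate_thought(state):
    """Stand-in for an LLM judging how promising a partial thought is.
    Toy heuristic: count how many leading characters already match the target."""
    score = 0
    for a, b in zip(state, TARGET):
        if a != b:
            break
        score += 1
    return score

def tree_of_thoughts(start="", depth=4, beam=3):
    frontier = [start]
    for _ in range(depth):
        # Expand every thought in the frontier into candidate next thoughts.
        candidates = [t for s in frontier for t in propose_thoughts(s)]
        # Deliberately evaluate and prune: keep only the `beam` best thoughts.
        frontier = sorted(candidates, key=evaluate_thought, reverse=True)[:beam]
    return max(frontier, key=evaluate_thought)

print(tree_of_thoughts())  # -> 2023
```

The contrast with chain-of-thought is the search structure: chain-of-thought commits to a single linear sequence of steps, while this loop explores and evaluates multiple branches before committing — the "deliberate" part the paper's title refers to.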