- Kronos: A Foundation Model for the Language of Financial Markets
  Paper • 2508.02739 • Published • 19
- Agent Learning via Early Experience
  Paper • 2510.08558 • Published • 277
- Context Engineering 2.0: The Context of Context Engineering
  Paper • 2510.26493 • Published • 9
- Kosmos: An AI Scientist for Autonomous Discovery
  Paper • 2511.02824 • Published • 7
Collections including paper arxiv:2602.20021
- Depth Anything V2
  Paper • 2406.09414 • Published • 103
- An Image is Worth More Than 16x16 Patches: Exploring Transformers on Individual Pixels
  Paper • 2406.09415 • Published • 51
- Physics3D: Learning Physical Properties of 3D Gaussians via Video Diffusion
  Paper • 2406.04338 • Published • 39
- SAM 2: Segment Anything in Images and Videos
  Paper • 2408.00714 • Published • 122
- LLMs + Persona-Plug = Personalized LLMs
  Paper • 2409.11901 • Published • 35
- To CoT or not to CoT? Chain-of-thought helps mainly on math and symbolic reasoning
  Paper • 2409.12183 • Published • 39
- Chain of Thought Empowers Transformers to Solve Inherently Serial Problems
  Paper • 2402.12875 • Published • 13
- TPI-LLM: Serving 70B-scale LLMs Efficiently on Low-resource Edge Devices
  Paper • 2410.00531 • Published • 33