- When can transformers reason with abstract symbols?
  Paper • 2310.09753 • Published • 4
- In-Context Pretraining: Language Modeling Beyond Document Boundaries
  Paper • 2310.10638 • Published • 30
- Reward-Augmented Decoding: Efficient Controlled Text Generation With a Unidirectional Reward Model
  Paper • 2310.09520 • Published • 12
- Connecting Large Language Models with Evolutionary Algorithms Yields Powerful Prompt Optimizers
  Paper • 2309.08532 • Published • 53
Lu Yu
VoladorLuYu
AI & ML interests
Neuro-Symbolic, Large Language Models, Graph Machine Learning
Recent Activity
- upvoted a paper 12 days ago: Every Attention Matters: An Efficient Hybrid Architecture for Long-Context Reasoning
- updated a collection about 1 month ago: AutoAgent
- liked a model about 1 month ago: inclusionAI/Ring-mini-linear-2.0
Organizations
None yet