Beijing Institute for General Artificial Intelligence (BIGAI)

Research Highlights

How to Synthesize Text Data without Model Collapse?

Lossless Acceleration of Ultra Long Sequence Generation

Are the Values of LLMs Structurally Aligned with Humans? A Causal Perspective

Look Both Ways and No Sink: Converting LLMs into Text Encoders without Training

ReflectEvo: Improving Meta Introspection of Small LLMs by Learning Self-Reflection

MindDial: Enhancing Conversational Agents with Theory-of-Mind for Common Ground Alignment and Negotiation