<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>AriadneMem: Threading the Maze of Lifelong Memory for LLM Agents</title>
<script src="https://cdn.tailwindcss.com"></script>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.0.0/css/all.min.css">
<style>
body { font-family: 'Inter', -apple-system, sans-serif; background-color: #ffffff; color: #333; }
.section-title { font-size: 2rem; font-weight: 800; margin-bottom: 2rem; border-bottom: 4px solid #3b82f6; display: inline-block; padding-bottom: 4px; }
.content-card { background: #fff; border-radius: 20px; box-shadow: 0 10px 30px rgba(0,0,0,0.05); padding: 1.5rem; border: 1px solid #f1f5f9; margin-bottom: 2.5rem; }
.img-full-width { width: 100%; height: auto; border-radius: 12px; display: block; margin: 0 auto; transition: transform 0.2s ease; }
.img-full-width:hover { transform: scale(1.01); }
/* Extra-large logo style: original aspect ratio, no cropping */
.project-logo-extra-large {
width: 90%;
max-width: 550px;
height: auto;
object-fit: contain;
margin: 40px auto;
display: block;
}
.video-container { width: 100%; max-width: 1000px; margin: 2rem auto 4rem auto; border-radius: 24px; overflow: hidden; box-shadow: 0 25px 50px -12px rgba(0,0,0,0.15); border: 1px solid #f1f5f9; }
.author-box { font-weight: 600; color: #1a202c; line-height: 1.4; }
.author-box span { white-space: nowrap; }
.affiliation-box { color: #718096; font-size: 0.95rem; line-height: 1.6; }
.email-box { font-family: monospace; color: #4a5568; margin-top: 10px; font-size: 0.9rem; }
/* Special typography for the Abstract */
.abstract-text { font-family: 'Georgia', serif; font-size: 1.15rem; line-height: 1.8; color: #2d3748; text-align: justify; }
pre { background: #1e293b; color: #e2e8f0; padding: 20px; border-radius: 12px; overflow-x: auto; font-size: 0.9rem; }
.btn-custom { background-color: #24292f; color: white; border-radius: 25px; padding: 12px 28px; transition: 0.3s; font-weight: 600; display: inline-flex; align-items: center; }
.btn-custom:hover { background-color: #000; transform: translateY(-2px); }
</style>
</head>
<body class="antialiased">
<header class="max-w-6xl mx-auto px-6 pt-20 pb-10 text-center">
<h1 class="text-4xl md:text-6xl font-extrabold mb-8 text-gray-900 leading-tight">
AriadneMem: Threading the Maze of <br><span class="text-blue-600">Lifelong Memory</span> for LLM Agents
</h1>
<img src="image/0.png" alt="AriadneMem Logo" class="project-logo-extra-large">
<div class="author-box flex flex-wrap justify-center gap-x-5 gap-y-2 text-lg mb-6 max-w-5xl mx-auto">
<span><b>Wenhui Zhu</b><sup>1*</sup></span>
<span><b>Xiwen Chen</b><sup>2*</sup></span>
<span><b>Zhipeng Wang</b><sup>3*</sup></span>
<span><b>Jingjing Wang</b><sup>4</sup></span>
<span><b>Xuanzhao Dong</b><sup>1</sup></span>
<span><b>Minzhou Huang</b><sup>5</sup></span>
<span><b>Rui Cai</b><sup>6</sup></span>
<span><b>Hejian Sang</b><sup>7</sup></span>
<span><b>Hao Wang</b><sup>4</sup></span>
<span><b>Peijie Qiu</b><sup>8</sup></span>
<span><b>Yueyue Deng</b><sup>9</sup></span>
<span><b>Prayag Tiwari</b><sup>10</sup></span>
<span><b>Brendan Hogan Rappazzo</b><sup>2</sup></span>
<span><b>Yalin Wang</b><sup>1</sup></span>
</div>
<div class="affiliation-box max-w-5xl mx-auto mb-4">
<sup>1</sup>Arizona State University, <sup>2</sup>Morgan Stanley, <sup>3</sup>Rice University, <sup>4</sup>Clemson University, <sup>5</sup>Northwestern University, <br class="hidden md:block">
<sup>6</sup>UC Davis, <sup>7</sup>Iowa State University, <sup>8</sup>Washington University in St. Louis, <sup>9</sup>Columbia University, <sup>10</sup>Halmstad University
</div>
<div class="email-box">
wzhu59@asu.edu, xiwen.chen@morganstanley.com
</div>
<div class="mt-4 mb-10 text-xs italic text-gray-400">* Equal Contribution</div>
<div class="flex flex-wrap justify-center gap-4 mb-16">
<a href="https://arxiv.org/abs/2603.03290" class="btn-custom"><i class="fas fa-file-pdf mr-2"></i>Paper</a>
<a href="https://github.com/LLM-VLM-GSL/AriadneMem" class="btn-custom"><i class="fab fa-github mr-2"></i>Code</a>
</div>
<div class="video-container">
<video class="w-full" controls autoplay muted loop playsinline>
<source src="image/123.mp4" type="video/mp4">
</video>
<div class="bg-gray-50 py-3 text-gray-500 text-sm font-medium border-t italic text-center">
Demo: Real-time Lifelong Memory Retrieval &amp; Reasoning
</div>
</div>
</header>
<main class="max-w-5xl mx-auto px-6">
<section class="mb-16">
<div class="content-card">
<img src="image/1.png" class="img-full-width" alt="Teaser Results">
</div>
</section>
<section class="mb-20 bg-slate-50 p-8 md:p-14 rounded-3xl border border-slate-200 shadow-inner">
<h2 class="section-title">Abstract</h2>
<div class="abstract-text">
Long-horizon LLM agents require memory systems that remain accurate under fixed context budgets.
However, existing systems struggle with two persistent challenges in long-term dialogue:
(i) <b>disconnected evidence</b>, where multi-hop answers require linking facts distributed across time,
and (ii) <b>state updates</b>, where evolving information (e.g., schedule changes) creates conflicts with older static logs.
We propose <b>AriadneMem</b>, a structured memory system that addresses these failure modes via a decoupled two-phase pipeline.
In the <b>offline construction phase</b>, AriadneMem employs <i>entropy-aware gating</i> to filter noisy, low-information messages before LLM extraction, and applies <i>conflict-aware coarsening</i> to merge static duplicates while preserving state transitions as temporal edges.
In the <b>online reasoning phase</b>, rather than relying on expensive iterative planning, AriadneMem executes <i>algorithmic bridge discovery</i> to reconstruct missing logical paths between retrieved facts, followed by <i>single-call topology-aware synthesis</i>.
On LoCoMo experiments with GPT-4o, AriadneMem improves <b>Multi-Hop F1 by 15.2%</b> and <b>Average F1 by 9.0%</b> over strong baselines.
Crucially, by offloading reasoning to the graph layer, AriadneMem reduces <b>total runtime by 77.8%</b> using only <b>497</b> context tokens.
</div>
</section>
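The entropy-aware gating step described above can be sketched as a simple information filter. The following is a minimal illustrative sketch, not the paper's implementation: the entropy proxy (word-distribution Shannon entropy) and the threshold value are assumptions chosen for clarity, and `wordEntropy` / `gateMessages` are hypothetical names.

```javascript
// Sketch of entropy-aware gating (illustrative; not the paper's implementation).
// Messages whose word distribution carries too little Shannon entropy
// (e.g. greetings and filler) are dropped before LLM extraction.

// Shannon entropy (in bits) of the word distribution in a message.
function wordEntropy(message) {
  const words = message.toLowerCase().split(/\s+/).filter(Boolean);
  if (words.length === 0) return 0;
  const counts = new Map();
  for (const w of words) counts.set(w, (counts.get(w) || 0) + 1);
  let h = 0;
  for (const c of counts.values()) {
    const p = c / words.length;
    h -= p * Math.log2(p);
  }
  return h;
}

// Keep only messages above an entropy threshold (hypothetical value).
function gateMessages(messages, threshold = 2.0) {
  return messages.filter((m) => wordEntropy(m) >= threshold);
}

const kept = gateMessages([
  "ok ok ok ok",
  "Alice moved her dentist appointment from Tuesday to Friday at 3pm",
]);
console.log(kept.length); // the repetitive filler message is dropped
```

A repeated single word has zero entropy, so such messages fall below any positive threshold, while a fact-bearing sentence with many distinct words passes through.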
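The online phase's algorithmic bridge discovery can likewise be pictured as a graph search that reconnects retrieved facts. Below is a minimal sketch under assumed structures: the adjacency-list memory graph and the `bridgePath` helper are hypothetical, standing in for whatever graph representation the system actually uses; BFS here simply recovers the shortest chain of edges linking two retrieved facts.

```javascript
// Sketch of algorithmic bridge discovery (illustrative; hypothetical graph API).
// BFS reconstructs the shortest path of memory edges between two retrieved
// facts, so synthesis sees a connected logical chain instead of fragments.
function bridgePath(adj, src, dst) {
  const prev = new Map([[src, null]]); // visited set + back-pointers
  const queue = [src];
  while (queue.length > 0) {
    const node = queue.shift();
    if (node === dst) {
      // Walk back-pointers from dst to src to rebuild the path.
      const path = [];
      for (let n = dst; n !== null; n = prev.get(n)) path.unshift(n);
      return path;
    }
    for (const next of adj.get(node) || []) {
      if (!prev.has(next)) {
        prev.set(next, node);
        queue.push(next);
      }
    }
  }
  return null; // no bridge between the two facts
}

// Toy memory graph: facts about "Alice" and "Friday" are linked
// only through the intermediate "dentist" node.
const memoryGraph = new Map([
  ["Alice", ["dentist"]],
  ["dentist", ["Friday"]],
  ["Friday", []],
]);
console.log(bridgePath(memoryGraph, "Alice", "Friday")); // [ 'Alice', 'dentist', 'Friday' ]
```

Because the path is found algorithmically rather than by iterative LLM planning, only the final single-call synthesis needs the model, which is consistent with the runtime reduction the abstract reports.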
<section class="mb-20">
<h2 class="section-title">Methodology</h2>
<div class="content-card">
<img src="image/2.png" class="img-full-width" alt="Framework Architecture">
</div>
</section>
<section class="mb-20">
<h2 class="section-title">Case Study</h2>
<div class="content-card">
<img src="image/3.png" class="img-full-width" alt="Qualitative Analysis">
</div>
</section>
<section class="mb-20">
<h2 class="section-title">Experimental Results</h2>
<div class="content-card">
<img src="image/4.png" class="img-full-width" alt="Main Results">
</div>
<div class="content-card">
<img src="image/5.png" class="img-full-width" alt="Efficiency Analysis">
</div>
</section>
<section class="mb-20">
<h2 class="section-title">Ablation Study</h2>
<div class="content-card">
<img src="image/6.png" class="img-full-width" alt="Ablation">
</div>
</section>
<section class="mb-24">
<h2 class="section-title">BibTeX</h2>
<div class="relative group">
<pre id="bibtex-code">
@misc{zhu2026ariadnememthreadingmazelifelong,
title={AriadneMem: Threading the Maze of Lifelong Memory for LLM Agents},
author={Wenhui Zhu and Xiwen Chen and Zhipeng Wang and Jingjing Wang and Xuanzhao Dong and Minzhou Huang and Rui Cai and Hejian Sang and Hao Wang and Peijie Qiu and Yueyue Deng and Prayag Tiwari and Brendan Hogan Rappazzo and Yalin Wang},
year={2026},
eprint={2603.03290},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2603.03290},
}</pre>
<button onclick="copyBib()" class="absolute top-4 right-4 bg-gray-700 text-white px-3 py-1 rounded text-xs hover:bg-gray-600 transition">Copy</button>
</div>
</section>
</main>
<footer class="pb-16 text-center text-gray-400 text-sm">
<p>© 2026 AriadneMem Project Team</p>
</footer>
<script>
function copyBib() {
const text = document.getElementById('bibtex-code').innerText;
// writeText returns a Promise; only report success once the copy completes.
navigator.clipboard.writeText(text)
.then(() => alert('BibTeX copied to clipboard!'))
.catch(() => alert('Copy failed — please select and copy the text manually.'));
}
</script>
</body>
</html>