
Part of the book series: Lecture Notes in Mathematics (volume 1957)

Abstract

Free entropy was defined by Voiculescu as a generalization of classical entropy to the non-commutative context. There are several definitions of free entropy; we shall concentrate on two of them. The first is the so-called microstates entropy, which measures the volume of matrices whose empirical distribution approximates a given law. The second, called the microstates-free entropy, is defined via a non-commutative version of Fisher information. The classical analogs of these definitions are, on one hand, the definition of the entropy of a measure \(\mu\) as the volume of points whose empirical distribution approximates \(\mu\), and, on the other hand, the well-known entropy \(\int \frac{d\mu}{dx}\log \frac{d\mu}{dx}\,dx\). In this classical setting, Sanov’s theorem shows that these two entropies are equal. The free analog of this statement is still open, but we shall give in this section bounds comparing the microstates and the microstates-free entropies. The ideas come from [37, 55, 56], but we shall try to simplify the proof in the hope of making it more accessible to non-probabilists (the original proof uses Malliavin calculus, but we shall give here an elementary version of the few properties of Malliavin calculus that we need). In the following, we consider only laws of self-adjoint variables (i.e., \(A_i^* = A_i\) for \(1 \le i \le m\)). We do not lose generality, since any operator can be written as \(x + \mathrm{i}y\) with \(x\) and \(y\) self-adjoint.
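
To fix notation for the comparison sketched above, the following displays record a commonly used form of Voiculescu's microstates entropy and of its classical counterpart; the normalization (in particular the \(\frac{m}{2}\log N\) correction and the order of the limits) follows the standard convention and may differ slightly from the one adopted in the body of the chapter. For a tuple \((a_1,\dots ,a_m)\) of self-adjoint variables with joint law \(\tau\), the microstates are

\[
\Gamma_R(a_1,\dots ,a_m;N,k,\varepsilon )=\Big\{(A_1,\dots ,A_m)\in \mathcal H_N(\mathbb C)^m:\ \|A_i\|\le R,\ \Big|\tfrac 1N\mathrm{Tr}\big(A_{i_1}\cdots A_{i_p}\big)-\tau \big(a_{i_1}\cdots a_{i_p}\big)\Big|\le \varepsilon \ \text{for all } p\le k,\ i_j\in \{1,\dots ,m\}\Big\},
\]

and the microstates entropy is

\[
\chi (a_1,\dots ,a_m)=\sup_{R>0}\ \inf_{k\in \mathbb N,\,\varepsilon >0}\ \limsup_{N\to \infty }\Big(\frac 1{N^2}\log \mathrm{Vol}\big(\Gamma_R(a_1,\dots ,a_m;N,k,\varepsilon )\big)+\frac m2\log N\Big),
\]

where \(\mathrm{Vol}\) denotes Lebesgue measure on the space of \(m\)-tuples of \(N\times N\) Hermitian matrices. The classical analog replaces matrices by real points: for a probability measure \(\mu\) on \(\mathbb R\) with density, up to standard regularity caveats,

\[
\lim_{\varepsilon \to 0}\ \limsup_{N\to \infty }\ \frac 1N\log \mathrm{Leb}^{\otimes N}\Big\{(x_1,\dots ,x_N):\ d\Big(\frac 1N\sum_{i=1}^N\delta_{x_i},\mu \Big)<\varepsilon \Big\}=-\int \frac{d\mu }{dx}\log \frac{d\mu }{dx}\,dx,
\]

which is the Sanov-type identification of the volume entropy with (minus) the entropy \(\int \frac{d\mu}{dx}\log \frac{d\mu}{dx}\,dx\) mentioned above.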

Author information

Correspondence to Alice Guionnet.

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Guionnet, A. (2009). Free entropy. In: Large Random Matrices: Lectures on Macroscopic Asymptotics. Lecture Notes in Mathematics, vol 1957. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69897-5_19
