  • 00:00

[MUSIC PLAYING] [An Introduction to Information Theory]

  • 00:09

LAV VARSHNEY: My name is Lav Varshney. [Lav Varshney, PhD, Assistant Professor, University of Illinois at Urbana-Champaign] I'm an assistant professor at the University of Illinois at Urbana-Champaign in the Department of Electrical and Computer Engineering. And I study the foundations of communications and computing, but also look at applications of some of these ideas to understand biological systems, social systems,

  • 00:30

LAV VARSHNEY [continued]: and algorithmic systems in various forms. So information theory is the mathematical theory of communication. It was developed originally by Claude Shannon in the 1940s to understand the fundamental limits of communication in the presence of noise. [Shannon, CE. (1948). A Mathematical Theory of Communication.] Trying to understand the nature of information and how much one

  • 00:50

LAV VARSHNEY [continued]: can compress data, but also how much one can communicate and what is the information rate -- the number of bits -- that you can actually transmit reliably over noisy channels. And after the 1940s, there was a lot of interest in applying these theories, not just in technology, as his motivation was,

  • 01:11

LAV VARSHNEY [continued]: but also to understand biological systems and social systems. And there was actually a burst of activity in those days, but it kind of fell off. And now that we have a lot more computation and a lot more data from a lot of these other fields, there's been a resurgence of interest in using information-theoretic techniques to understand not just how systems work

  • 01:33

LAV VARSHNEY [continued]: but why they are the way they are. Are they operating at the limits of what is possible? I've been working in information theory for quite a while now, since my graduate school days. But in fact, I even took a class in information theory when I was an undergrad. And I got interested in it in kind of a strange manner.

  • 01:55

LAV VARSHNEY [continued]: I was taking some history of engineering classes as an undergrad, and the professor, Ronald Kline, was doing some research on the early history of information theory. So I actually learned the history of the field before I learned the field itself. And I just found it very beautiful and very clean, and I got seduced by that beauty, I suppose.
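As a minimal illustration of the limits described above -- how far a source can be compressed and how many bits can be sent reliably over a noisy channel -- here is a short sketch in Python. It is not part of the video; the source probabilities and the channel's flip probability are illustrative assumptions.

```python
# Minimal sketch (illustrative values): Shannon entropy of a source, which
# lower-bounds how many bits per symbol are needed to compress it losslessly,
# and the capacity of a binary symmetric channel, which upper-bounds the rate
# of reliable transmission over that channel.
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def bsc_capacity(eps):
    """Capacity in bits per use of a binary symmetric channel that
    flips each bit independently with probability eps."""
    return 1.0 - entropy([eps, 1.0 - eps])

source = [0.9, 0.1]  # a biased binary source (illustrative)
print(f"source entropy:   {entropy(source):.3f} bits/symbol")  # ~0.469
print(f"channel capacity: {bsc_capacity(0.1):.3f} bits/use")   # ~0.531
```

Compressing below the entropy value, or transmitting reliably above the capacity value, is impossible; that is the sense of "fundamental limits" used throughout the interview.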

  • 02:16

LAV VARSHNEY [continued]: [How can information theory be applied?] The nice thing with a mathematical theory like information theory is that it is very widely applicable. But one does really need to understand the system that one is trying to analyze. Shannon himself, actually, was good about this. He drew a block diagram which described

  • 02:37

LAV VARSHNEY [continued]: the transmitter, the communication channel, the receiver, and so on. And he really cast his theory to fall into the framework of the block diagram. When we look at other systems, I think it's very good to start by constructing a block diagram of what we're trying to do and then go from there. [What are some recent advances in applications of information

  • 02:58

LAV VARSHNEY [continued]: theory?] There are large corpora of language that we're starting to analyze using information theory, which is very exciting. But also, looking at the flow of information among people and in societies, I think that's becoming very interesting now that we're able to capture things in terms of people's conversations,

  • 03:20

LAV VARSHNEY [continued]: their mobility, their interactions with others. [What skills do you need to understand and apply information theory?] To understand information theory and apply it in new situations, like we might to get a better handle on social life, I think it is good to have a strong foundation

  • 03:41

LAV VARSHNEY [continued]: in probability theory, since it is a mathematical, statistical theory. But also, it's very important to really have an understanding of the system itself. So strong training in the social sciences is actually, perhaps, even more important than the probabilistic ones. I think there are now a lot of mathematical techniques

  • 04:04

LAV VARSHNEY [continued]: from machine learning that are useful for learning probabilistic models. And I think that's really useful for then further applying information-theoretic ideas to understand the fundamental limits and, perhaps, optimal designs of social systems, then asking whether we, as people, actually approach those fundamental limits. I think a lot of the modern programming languages

  • 04:26

LAV VARSHNEY [continued]: are appropriate. In fact, some of these techniques are fairly straightforward. It's just about handling the large data sets and thinking about the theoretical frameworks. [Can you give an example of when you used information theory and what you learned in the process?] To describe an interesting application of information theory, let me talk about an example in neuroscience, because I think it's very straightforward to understand.
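Returning briefly to the block-diagram framing described a few segments earlier, here is a minimal sketch, not from the video, of a source, a transmitter (a simple 3x repetition encoder), a noisy channel (each bit flipped with probability 0.1), and a receiver (a majority-vote decoder). The scheme and all parameters are illustrative assumptions, not anything presented in the interview.

```python
# Minimal block-diagram sketch: source -> transmitter -> channel -> receiver.
# The repetition code and the 10% flip probability are illustrative.
import random

def transmitter(bits, n=3):
    """Encode each message bit by repeating it n times."""
    return [b for b in bits for _ in range(n)]

def channel(bits, flip_prob=0.1):
    """Binary symmetric channel: flip each bit independently with flip_prob."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def receiver(bits, n=3):
    """Majority-vote decode each block of n repeated bits."""
    blocks = (bits[i:i + n] for i in range(0, len(bits), n))
    return [1 if sum(block) > n / 2 else 0 for block in blocks]

random.seed(0)
message = [random.randint(0, 1) for _ in range(10_000)]  # information source
decoded = receiver(channel(transmitter(message)))
errors = sum(m != d for m, d in zip(message, decoded))
print(f"decoded bit error rate: {errors / len(message):.4f}")  # ~0.028 vs 0.10 raw
```

With these illustrative numbers, the decoded error rate falls well below the channel's raw 10% error rate, at the cost of sending three channel bits per message bit; information theory characterizes how much better than such simple schemes one can possibly do.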

  • 04:49

LAV VARSHNEY [continued]: So there have been a lot of experiments recently, over the last maybe 15 years, to measure the distribution of synaptic strength in the brain. And in fact, a lot of synapses in the brain are actually very noisy, and this is confusing. You would think that nature would have made better synapses. But what we asserted was an optimization approach

  • 05:11

LAV VARSHNEY [continued]: to biology, thinking that the brain is actually optimized for something. And in particular, we asked, is the brain optimal for information storage capacity per unit volume? And volume is a good cost because you don't want your head too big, right? That's a good measure of metabolic energy and all. And so we applied information-theoretic approaches to experimental data and made a prediction

  • 05:34

LAV VARSHNEY [continued]: on what the distribution of synaptic strength in the brain should be if it were optimized for this principle, and electrophysiology experiments show pretty much the same distribution. So it's suggestive that maybe the brain is optimized for this principle of storage capacity per unit volume. And one could imagine doing the same style of analysis in a variety of other settings, whether it's

  • 05:55

LAV VARSHNEY [continued]: neuroscience or psychophysics or social structure. So one kind of builds up some of these same ideas, taking a hypothesis from optimality theory and then testing whether that actually plays out. [What recommendations do you have for someone who wants to learn more about information theory?] So I think before one gets into this area,

  • 06:17

LAV VARSHNEY [continued]: I think there are techniques, but there are also problems. And I think thinking about the problems is really the best place to start. So what is it about the social world that one wants to understand? And then pick the appropriate tools, whether they're computational or mathematical, to actually answer those questions. [What three pieces of advice would you give a student

  • 06:38

LAV VARSHNEY [continued]: entering computational social science?] Some pieces of advice I have for a student going into computational social sciences -- to think about the tools that you have and what questions they can answer. So when the microscope was invented, you could see small things, or when the telescope was invented, you could see really far away things. So now that we can capture and instrument social phenomena,

  • 06:59

LAV VARSHNEY [continued]: what is it that you can see, and what can you do with that? And then secondly, can we think about a structure to analyze what's going on -- a hypothesis, a framework, a kind of mathematical theory to explain what's going on? And also, I think having fun is really important.

  • 07:19

LAV VARSHNEY [continued]: This area is so interesting, right? You're studying who we are as people and how we interact with others, and that's just so exciting. [Are there any advances in the field you are excited about?] I think there are a lot of new techniques using network science in combination with unsupervised learning that

  • 07:40

LAV VARSHNEY [continued]: are very interesting. So rather than having labeled data, can you have data that's completely unlabeled and then understand the clustering and the kind of connections between different concepts? In terms of new areas for people to go into in this computational social science topic,

  • 08:00

LAV VARSHNEY [continued]: my training is actually as an engineer, and so I often think of societal impact. So what can we actually change from the insights from social science? And I think health and well-being are actually very important. So if we can understand how social structure impacts our health, then what actions can we take to improve that?
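As a small, hypothetical sketch of the unsupervised, network-based analysis mentioned a couple of segments earlier -- recovering structure from completely unlabeled interaction data -- the following assumes the third-party networkx library and uses its built-in karate-club graph as a stand-in for real conversation or mobility data; none of this comes from the video.

```python
# Hypothetical sketch: community detection on an unlabeled interaction graph.
# Requires the networkx package; the karate-club graph stands in for real data.
import networkx as nx
from networkx.algorithms import community

G = nx.karate_club_graph()  # 34 people; edges are observed interactions, no labels
clusters = community.greedy_modularity_communities(G)

for i, members in enumerate(clusters):
    print(f"community {i}: {sorted(members)}")
```

No labels are used at any point; the grouping emerges purely from the pattern of connections, which is the kind of clustering over unlabeled data described above.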

  • 08:21

LAV VARSHNEY [continued]: We can think about food and other aspects of culture and how those impact us, how social capital influences health. So I think that area is really exciting to me. [What key readings would you recommend on learning information theory?] To get a good introduction to information theory,

  • 08:42

LAV VARSHNEY [continued]: I think there is a book by John Pierce, which is called Symbols, Signals and Noise. I think that's a very nice kind of general introduction to the field. Some people even like reading Shannon's original 1948 paper. It's actually kind of fun to read. And, of course, now there's plenty of textbooks on the subject. [FURTHER READING]

  • 09:02

[Cover, TM, & Thomas, JA. (2005). Elements of information theory. Hoboken, NJ: Wiley.]
[MacKay, DJC. (2003). Information theory, inference and learning algorithms. Cambridge, UK: Cambridge University Press.]
[Pierce, J. (1980). An introduction to information theory: Symbols, signals and noise. New York, NY: Dover Publications.]
[Shannon, CE. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27(3), 379-423.]
[Varshney, LR, Sjöström, PJ, & Chklovskii, DB. (2006). Optimal information storage in noisy synapses under resource constraints. Neuron, 52(3), 409-423.]
[Yeung, RW. (2008). Information theory and network coding. New York, NY: Springer.]
[MUSIC PLAYING]

Video Info

Publisher: SAGE Publications Ltd

Publication Year: 2019

Video Type: Tutorial

Methods: Big data, Computational modelling

Keywords: communication research; information theory; mathematical theory of communication; neuroscience; probability theory; programming and scripting languages; social science; social science research; synapses

Segment Info

Segment Num.: 1


Abstract

Lav Varshney, PhD, Assistant Professor in the Department of Electrical and Computer Engineering at the University of Illinois, discusses information theory, or the mathematical theory of communication, including its application, recent advances, skills needed to understand and apply it, research examples, as well as recommendations and advice for those considering entering the field.
