Happy Moon Landing Day

 

Forty-nine years ago. This tribute is reposted from last year:


Book of note: “The Personalized Diet” by Eran Segal and Eran Elinav

I have blogged earlier about the book by neuroscientist Sandra Aamodt and have discussed there in passing the pioneering work by Weizmann Institute scientists Eran Segal and Eran Elinav on the individual microbiome (our “gut bacteria population”) and how it affects blood sugar levels. Now the duo has teamed up with editor Eve Adamson, and together they have put out a popularized book:

I am familiar with some of the original papers in top scientific journals—the book is of course much more readable, and the authors and editors have done a good job of presenting the research in lay language while preserving its broad strokes.

The bottom line of their research is this: each of us carries a whole ecosystem of bacteria in our intestines, which help us digest and absorb food. The specific mix of bacteria varies between individuals, and hence so do our responses to different foods. While weight gain/loss is best seen as an outcome—one aspect of overall health—the glycemic response, i.e., the change in blood sugar levels after a meal (“postprandial glucose response”), is sufficiently rapid that it can be monitored in real time (e.g., with a continuous glucose monitor) and correlated with what the person ate (logged in a smartphone app). Doing this for thousands of people is a big-data project par excellence, and this is how computer scientist Segal teamed up with gastroenterologist Elinav.
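
To make the “monitored in real time and correlated with what the person ate” part concrete, here is a minimal sketch (in Python) of one conventional way to quantify a postprandial glucose response: the incremental area under the glucose curve for the two hours following a logged meal. The two-hour window, the trapezoid rule, and all the numbers below are illustrative assumptions on my part, not the authors’ actual pipeline.

```python
from datetime import datetime, timedelta

def incremental_auc(readings, meal_time, window_hours=2.0):
    """Incremental area under the glucose curve after a meal (trapezoid rule).

    readings:  list of (datetime, glucose_mg_dl) tuples from a continuous
               glucose monitor, assumed sorted by time.
    meal_time: datetime at which the meal was logged in the food-diary app.
    """
    window_end = meal_time + timedelta(hours=window_hours)
    window = [(t, g) for t, g in readings if meal_time <= t <= window_end]
    if len(window) < 2:
        return 0.0
    baseline = window[0][1]  # glucose level at (or just after) the meal
    auc = 0.0
    for (t0, g0), (t1, g1) in zip(window, window[1:]):
        dt_min = (t1 - t0).total_seconds() / 60.0
        # count only the area above the pre-meal baseline
        auc += max(0.0, ((g0 - baseline) + (g1 - baseline)) / 2.0) * dt_min
    return auc  # units: (mg/dL) * minutes

# toy example: one (invented) reading every 15 minutes after a meal at 08:00
meal = datetime(2018, 7, 20, 8, 0)
cgm = [(meal + timedelta(minutes=15 * i), g)
       for i, g in enumerate([90, 115, 140, 130, 110, 100, 95, 92, 90])]
print(incremental_auc(cgm, meal))
```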

But this isn’t where it ends. Gut bacteria populations can of course be obtained from stool samples, and subjected to analysis—another aspect of the massive big-data puzzle. Moreover, some of what they infer from the data can be checked in an animal model—for instance, certain gut bacteria can be administered to sterile mice and their weight gain (or lack thereof) in response to certain food mixtures tested on a much shorter time scale than would be possible in slow, relatively large, and long-lived mammals like us.

The duo brought different, complementary perspectives to the problem, not just scientifically but personally. Elinav always loved to take machines apart and see how they fit together (fittingly, he did his military service aboard a submarine), and then became fascinated with living organisms. He ended up studying medicine, then specializing in internal medicine. During his residency, he was exposed to the human suffering caused by “metabolic syndrome” (the term given to the combination of severe obesity, adult-onset diabetes, fatty liver, hyperlipidemia, and the complications thereof). He realized that he and his fellow doctors spent all their time dealing with the consequences and complications rather than with the root cause.

Segal, on the other hand, was an avid long-distance runner in his spare time. He started experimenting with different nutritional approaches to improve his endurance as a runner, assisted in this pursuit by his wife, a clinical dietitian. As he dove deeper into this and observed the diets of fellow runners, it became increasingly clear to him that there was no one-size-fits-all, and that recommendations held to be gospel truth (or Torah from Sinai, in our case) were, in fact, counterproductive for some. Why do some runners who eat dates before a run become energized and others exhausted? Why do some do best with carb-loading, and indeed thrive on high-carb diets, while others quickly pack on the pounds and suffer from low energy?

Segal was already involved in the computational study of the human genome at the time and then started reading about the emergent field of study of the microbiome. One thing led to another, a mutual acquaintance put Segal and Elinav in touch with each other, and together they embarked on the collaboration that eventually morphed into the personalized nutrition project.

One factor that facilitated their research was that rapid, reliable, and minimally invasive blood glucose monitoring technology has become relatively inexpensive. And here came some of their first surprises. Anybody who has followed a Gary Taubes-type diet, or who is trying to manage diabetes, is aware of the ‘glycemic index’ (GI) of foods—the increase in blood sugar levels caused by eating a given amount of the food, compared to the same amount of pure glucose (for which GI=100 by definition). But how uniform are these values really?

Segal and Elinav found that the GI for some foods (e.g., bananas) differed very little between their test subjects (say, 60-65), while others (e.g., apples) were all over the place (40-90). Moreover, the variation was not random but correlated with the person.
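
For readers who like to see the definition spelled out: the glycemic index is simply the ratio of the incremental glucose AUC after eating a fixed amount of carbohydrate from the test food to the AUC after the same amount of pure glucose, times 100. The toy numbers below are invented for illustration; they are not measurements from the book.

```python
def glycemic_index(auc_test_food, auc_pure_glucose):
    """GI = (incremental AUC for the test food / AUC for pure glucose) * 100."""
    return 100.0 * auc_test_food / auc_pure_glucose

# made-up AUC values (mg/dL * minutes) for the same carbohydrate load
print(glycemic_index(auc_test_food=1800.0, auc_pure_glucose=3000.0))  # -> 60.0
```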

One would expect the glycemic response to go up more or less linearly with the amount of the food consumed. They found that this is indeed true for smaller amounts, but at some point saturation sets in as the body manufactures more insulin, and the glucose response levels off. (This, of course, does not mean you can just eat ten times as much: the insulin will cause the excess energy to be stored as fat!)
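
The book describes this shape only qualitatively (roughly linear for small portions, then leveling off); it gives no formula. A Michaelis-Menten-style saturating curve is one conventional way to sketch such behavior; the parameters below are purely illustrative.

```python
def glucose_response(carbs_g, max_response=80.0, half_saturation_g=60.0):
    """Peak glucose rise (mg/dL) as a saturating function of carbs eaten (g)."""
    return max_response * carbs_g / (half_saturation_g + carbs_g)

for grams in (20, 40, 60, 80, 100, 120):
    print(grams, round(glucose_response(grams), 1))
# each additional 20 g of carbs raises the response by less than the previous 20 g
```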

More surprising, however, was that higher fat content in the meal on average caused a minor decrease in glycemic response. For a nontrivial number of their participants, eating toast with butter or olive oil actually did less glycemic harm than eating the toast on its own.

Now, keeping blood sugar levels on a more even keel has two major benefits. In the short term, yo-yoing blood sugar means a drop in energy and a feeling of exhaustion, as the body pumps out insulin in response to a sugar spike and blood sugar then dips. As for the long term: Segal and Elinav found across their sample that the glycemic response after habitual meals is strongly correlated with BMI. Keeping blood sugar steady thus turns out to be a win-win on all counts.

And here’s the catch—”thanks” to our microbiome, glycemic response is highly individual. Segal himself ‘spikes’ after eating rice, while Elinav does not. One person spikes after ice cream, while another does not—and the same person who spikes after an evening snack of ice cream can safely have chocolate instead, go figure.

This addresses a seeming paradox. It’s not that diets don’t work—in fact, many do for some people, though long-term compliance can be an issue—it’s that there is no diet that will work for everyone, or even for most people.

So the next step, then, was to have a computer analyze the data for some of the participants in depth, and have it plan out a personalized diet that would keep blood sugar levels as steady as possible for that person. Guess what? Yup, you guessed it: it worked.
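
As a cartoon of the idea (and no more than that): once you have person-specific predicted responses for candidate meals, planning a “blood-sugar-friendly” menu amounts to picking from the low end of that person’s own ranking. The foods and numbers below are invented; the actual project used far richer models and dietitian-designed menus.

```python
# person-specific predicted responses (e.g., incremental AUC); all values invented
predicted_response = {
    "oatmeal": 55.0, "rice bowl": 140.0, "apple": 95.0,
    "ice cream": 150.0, "hummus & pita": 70.0, "banana": 90.0,
}

def personalized_menu(predictions, n_items=3):
    """Return the n_items foods with the lowest predicted glycemic response."""
    ranked = sorted(predictions.items(), key=lambda kv: kv[1])
    return [food for food, _ in ranked[:n_items]]

print(personalized_menu(predicted_response))  # lowest predicted spikes first
```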

Now some people might be discouraged by the idea of carrying around a blood glucose monitor for two weeks and carefully logging every meal (and physical activity). But once a large enough dataset has been established, and correlated with analyses of the gut flora composition of all the participants, it becomes possible to predict glycemic responses to different foods with reasonable accuracy from a bacterial population analysis of a stool sample. A startup company named DayTwo is offering to do exactly that. [Full disclosure: I have no financial interest in DayTwo or in any of Drs. Segal and Elinav’s ventures.]
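
In machine-learning terms, this is a regression problem: learn a mapping from microbiome (and meal) features to glycemic response on the people who did wear a monitor, then predict for new people from a stool sample alone. The sketch below uses synthetic data and a generic gradient-boosting regressor purely to illustrate the shape of the problem; it is not a reconstruction of DayTwo’s algorithm.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
# rows = person/meal pairs; columns = relative abundances of a few bacterial
# taxa plus the carbohydrate content of the meal (all synthetic data)
X = rng.random((200, 6))
y = 40 + 120 * X[:, 5] * (0.5 + X[:, 0]) + rng.normal(0, 10, 200)  # response

# train on the first 150 "people", evaluate on the held-out 50
model = GradientBoostingRegressor().fit(X[:150], y[:150])
print("held-out R^2:", round(model.score(X[150:], y[150:]), 2))
```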

We are at the dawn of a major revolution in healthcare—a shift away from a paradigm of statistical averages to one of detailed monitoring of individual patients. Call it ‘personalized medicine’ or any other buzzword: it does seem poised to radically change healthcare and individual health outcomes for the better.

 

Saturday musical delight: Well-Tempered Clavier in MuseScore animation

Via the YouTube channel “gerubach”, which has been presenting “scrolling score” videos of musical compositions for many years, I stumbled upon the following gem of a playlist:

All of Book I of Bach’s Well-Tempered Clavier is being rendered there in MuseScore animation: as you hear the audio, not only do you see the score on screen (two systems at a time) and a pointer scrolling across the notes being played, but at the bottom of the screen, you see the notes currently sounding displayed on a piano keyboard.

Especially in combination with YouTube’s ability to play back videos at reduced speed without altering the pitch, this is a marvelous self-tutoring tool for keyboard playing as well as music theory.

The audio is taken from the performance by pianist (and former competitive weight lifter!) Kimiko Ishizaka [official website]. The MuseScore team could legally do this because the (IMHO excellent) performance was released into the public domain (!).

The onetime child prodigy pre-funds her recordings through Kickstarter campaigns (most recently, she ran one for a “Libre” recording of Bach’s The Art of Fugue), then releases them online under public-domain or Creative Commons licenses. The word “Libre” has some currency in the open-source software community: it refers to one of the two words in French (and other Romance languages) that correspond to the English “free”, namely libre (without restrictions, “free as in speech”) as opposed to gratis (without cost, “free as in lunch”). She does not work gratis, but on what I have been calling a “massively distributed commissioning” model, and what is becoming known as a “threshold pledge” model: she sets a funding goal, solicits pledges from patrons on Kickstarter, and only if the threshold is met is the work performed and the money collected. For her last campaign, the threshold she set was 20,000 Euro, and the minimum pledge was 10 Euro — the price of an album at a CD store (remember those?). Larger pledge amounts (20 Euro, 50 Euro, 100 Euro) get various extra goodies, such as live recordings from recent concerts, a physical CD of the music, and admission to one of three “meatspace” live concerts.
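
For the arithmetically inclined, the threshold-pledge mechanics boil down to a very simple rule: pledges are only collected if the campaign total reaches the goal. The figures below mirror the ones just quoted (20,000 Euro goal, 10 Euro minimum pledge); the pledge list itself is invented.

```python
GOAL_EUR = 20_000
MIN_PLEDGE_EUR = 10

def campaign_outcome(pledges_eur):
    """Return (funded?, total) — money changes hands only if funded is True."""
    valid = [p for p in pledges_eur if p >= MIN_PLEDGE_EUR]
    total = sum(valid)
    return total >= GOAL_EUR, total

# an invented mix of small and large patrons
funded, total = campaign_outcome([10] * 1500 + [50] * 80 + [100] * 20)
print(funded, total)  # -> True, 21000: the recording goes ahead
```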

D. Jason Fleming has been talking a lot about the “Open Culture Movement”. I believe this is an interesting example, and may actually point a way toward the future for classical performers. The big losers here, of course, are the classical music labels, who in this model are about as profitable as illegal CD bootleggers….

 

RIP Jerry Pournelle, 1933-2017

The great Jerry Pournelle, political scientist, technological visionary, prolific science fiction writer (often in collaboration with Larry Niven), and computing pioneer all in one, just passed away after a brief respiratory illness. He had appeared at DragonCon only days earlier.

I’ve been following Chaos Manor on and off since it was first a print column in BYTE magazine, back in the Early Tertiary era of computing. The online version has a serious claim to being the world’s oldest blog.
Novels like “Fallen Angels” (with Niven & Flynn) or “The Mote in G-d’s Eye” (with Niven) would have made the reputation of a lesser man. But aside from being a prolific science fiction writer, he was also a compelling thinker and technological visionary. Even with half his brain zapped by radiation treatments, he could still out-think most soi-disant “intellectuals”. Pournelle suffered no fools intellectually, but by all accounts was a generous and solicitous human being in private.
Here is a taste of Jerry Pournelle in his own words. (He was, by the way, apparently the first writer to write a published novel entirely on a [then primitive and monstrously expensive] personal computer.)

HOW TO GET MY JOB

The question I get most often, both in mail and when I speak, is, “How do I get your job?” Usually it’s done a bit more politely, but sometimes it’s asked just that way. It’s generally phrased differently by computer audiences than by science fiction audiences, but both really want to know the same thing: how do you become an author?

I always give the same answer: it’s easy to be an author, whether of fiction or nonfiction, and it’s a pleasant profession. Fiction authors go about making speeches and signing books. Computer authors go to computer shows and then come home to open boxes of new equipment and software, and play with the new stuff until they tire of it. It’s nice work if you can get it.

The problem is that no one pays you to be an author.

To be an author, you must first be a writer; and while it’s easy to be an author, being a writer is hard work. Surprisingly, it may be only hard work; that is, while some people certainly have more talent for writing than others, everyone has some. The good news is that nearly anyone who wants to badly enough can make some kind of living at writing. The bad news is that wanting to badly enough means being willing to devote the time and work necessary to learn the trade.

The secret of becoming a writer is that you have to write. You have to write a lot. You also have to finish what you write, even though no one wants it yet. If you don’t learn to finish your work, no one will ever want to see it. The biggest mistake new writers make is carrying around copies of unfinished work to inflict on their friends.

I am sure it has been done with less, but you should be prepared to write and throw away a million words of finished material. By finished, I mean completed, done, ready to submit, and written as well as you know how at the time you wrote it. You may be ashamed of it later, but that’s another story.

The late Randall Garrett, one of the most prolific writers of the Golden Age of Science Fiction, used to have a number of rules, many of them scatological. One of them was that no professional writer ever got anything from formal courses in writing. I think he was wrong, in the sense that a good formal introduction to the rules of grammar and spelling can be extremely useful; but he had a point, which is that there aren’t any secrets to be learned from creative-writing courses. If the only way you can force yourself to write that million words of your best work is to take a class in creative writing or attend a writers’ workshop, by all means do it; but do it understanding that the good comes from the writing you do, not from the criticism or theory or technique taught in the class.

May his memory be blessed. The science fiction field and the blogosphere are truly poorer places without him.

 

On Google and doublethink

The new Google slogan has been unveiled today (hat tip: Marina F.):

[image: “wip-google”]

For those who have been living under a rock: Google fired an employee for having the temerity to write a memo [draft archived here][full text here via Mark Perry at AEI] questioning the “diversity” (what I call “fauxversity”) and “affirmative action” (i.e., reverse discrimination) policies of the company. Said employee had earlier filed a labor grievance and is taking legal action. Now quite interestingly, here is an article in which four actual experts discuss the science underlying the memo, and basically find it unexceptional even though they do not all agree with the author on its implications. One of them, an evolutionary psychology professor at U. of New Mexico, has the money quote:

Here, I just want to take a step back from the memo controversy, to highlight a paradox at the heart of the ‘equality and diversity’ dogma that dominates American corporate life. The memo didn’t address this paradox directly, but I think it’s implicit in the author’s critique of Google’s diversity programs. This dogma relies on two core assumptions:
  • The human sexes and races have exactly the same minds, with precisely identical distributions of traits, aptitudes, interests, and motivations; therefore, any inequalities of outcome in hiring and promotion must be due to systemic sexism and racism;
  • The human sexes and races have such radically different minds, backgrounds, perspectives, and insights, that companies must increase their demographic diversity in order to be competitive; any lack of demographic diversity must be due to short-sighted management that favors groupthink.
The obvious problem is that these two core assumptions are diametrically opposed.
Let me explain. If different groups have minds that are precisely equivalent in every respect, then those minds are functionally interchangeable, and diversity would be irrelevant to corporate competitiveness. For example, take sex differences. The usual rationale for gender diversity in corporate teams is that a balanced, 50/50 sex ratio will keep a team from being dominated by either masculine or feminine styles of thinking, feeling, and communicating. Each sex will counter-balance the other’s quirks. (That makes sense to me, by the way, and is one reason why evolutionary psychologists often value gender diversity in research teams.) But if there are no sex differences in these psychological quirks, counter-balancing would be irrelevant. A 100% female team would function exactly the same as a 50/50 team, which would function the same as a 100% male team. If men are no different from women, then the sex ratio in a team doesn’t matter at any rational business level, and there is no reason to promote gender diversity as a competitive advantage.
Likewise, if the races are no different from each other, then the racial mix of a company can’t rationally matter to the company’s bottom line. The only reasons to value diversity would be at the levels of legal compliance with government regulations, public relations virtue-signalling, and deontological morality – not practical effectiveness. Legal, PR, and moral reasons can be good reasons for companies to do things. But corporate diversity was never justified to shareholders as a way to avoid lawsuits, PR blowback, or moral shame; it was justified as a competitive business necessity.
So, if the sexes and races don’t differ at all, and if psychological interchangeability is true, then there’s no practical business case for diversity.
On the other hand, if demographic diversity gives a company any competitive advantages, it must be because there are important sex differences and race differences in how human minds work and interact. For example, psychological variety must promote better decision-making within teams, projects, and divisions. Yet if minds differ across sexes and races enough to justify diversity as an instrumental business goal, then they must differ enough in some specific skills, interests, and motivations that hiring and promotion will sometimes produce unequal outcomes in some company roles. In other words, if demographic diversity yields any competitive advantages due to psychological differences between groups, then demographic equality of outcome cannot be achieved in all jobs and all levels within a company. At least, not without discriminatory practices such as affirmative action or demographic quotas.
So, psychological interchangeability makes diversity meaningless. But psychological differences make equal outcomes impossible. Equality or diversity. You can’t have both.
Weirdly, the same people who advocate for equality of outcome in every aspect of corporate life, also tend to advocate for diversity in every aspect of corporate life. They don’t even see the fundamentally irreconcilable assumptions behind this ‘equality and diversity’ dogma.

[“Jeb Kinnison” draws my attention to another article.] I just saw in an essay by Christina Hoff Sommers [see also video] on the AEI website that the National Science Foundation [!], as recently as 2007, sent around a questionnaire asking researchers to identify any research equipment in their lab building that was not accessible to women. In 2007. Seriously, I don’t know whether whoever came up with this “go find the crocodile milk” policy was gunning for a Nobel prize in Derpitude

 

[image: “derp seal”]

or trying to create sinecures for otherwise unemployable paper-pushers, or trying to divert bureaucratic energy into a Mobius loop that would minimize interference with serious decisions.

But on a more serious note: even before I saw the “paradox” remarks, I could not help being reminded of this passage in George Orwell’s “Nineteen Eighty-Four”. The protagonist, Winston Smith, retorts to his mentor turned inquisitor:

‘But the whole universe is outside us. Look at the stars! Some of them are a million light-years away. They are out of our reach for ever.’
‘What are the stars?’ said O’Brien indifferently. ‘They are bits of fire a few kilometres away. We could reach them if we wanted to. Or we could blot them out. The earth is the centre of the universe. The sun and the stars go round it.’
Winston made another convulsive movement. This time he did not say anything. O’Brien continued as though answering a spoken objection:
 ‘For certain purposes, of course, that is not true. When we navigate the ocean, or when we predict an eclipse, we often find it convenient to assume that the earth goes round the sun and that the stars are millions upon millions of kilometres away. But what of it? Do you suppose it is beyond us to produce a dual system of astronomy? The stars can be near or distant, according as we need them. Do you suppose our mathematicians are unequal to that? Have you forgotten doublethink?’ 

Precisely: doublethink. Thus it is possible, for example, that certain biological differences between men and women, or between ethnic groups, can be at the same time out of bounds for polite discussion, yet entirely taken for granted in a medical setting. I remember when Jackie Mason in the early 1990s joked about wanting an [Ashkenazi] Jewish affirmative action quota for runners and basketball players: nowadays, that joke would probably get him fired at Google, while a sports doctor treating top athletes would just chuckle.

The root of evil here is twofold:

(1) the concept that even correct factual information might be harmful as it might encourage heresy [hmm, where have we heard that one before?];

(2) considering people as interchangeable members of collectives, rather than individuals. If one considers the abilities of a specific individual, then for the case at hand it does not matter whether the average aptitudes for X differ significantly between groups A and B, or not. (There is, in any case, much greater variability between individuals within a group than between groups; a short numerical sketch of this point follows after (2b) below.)

I would add:
(2b) overconfidence in numerical benchmarks by people without a real grasp of what they mean.
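
Returning for a moment to the parenthetical claim in (2): the point that within-group variability dwarfs the difference between group averages can be made concrete with a back-of-the-envelope variance decomposition. The two distributions below are invented, chosen only so that the group means differ slightly while the individual spread is large.

```python
import numpy as np

rng = np.random.default_rng(1)
group_a = rng.normal(loc=100.0, scale=15.0, size=10_000)
group_b = rng.normal(loc=103.0, scale=15.0, size=10_000)  # small mean shift

pooled = np.concatenate([group_a, group_b])
# between-group variance: spread of the two group means around the grand mean
between = ((group_a.mean() - pooled.mean()) ** 2
           + (group_b.mean() - pooled.mean()) ** 2) / 2
# within-group variance: average spread of individuals inside each group
within = (group_a.var() + group_b.var()) / 2
print(f"between-group variance: {between:.1f}, within-group variance: {within:.1f}")
# the within-group term dwarfs the between-group term (here by a factor of ~100)
```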

Outside the strict PC/AA context, it is the fallacy in (2b) which gives rise to such pathologies as politicians pushing for ever-higher HS graduation or college enrollment rates — because they only see “the percentage has gone up from X to Y” without seeing the underlying reality. They are much like the economic planners in the (thank G-d!) former USSR, who accepted inflated production statistics of foodstuffs and consumer goods at face value, while all those not privileged enough to shop inside the Nomenklatura bubble knew well enough that they were a sham. Likewise, those of us educated in a bygone era realize that the “much greater” HS and college graduation rates of today were achieved by the educational equivalent of puppy milling:

  • the HS curriculum has for most pupils been watered down to meaninglessness;
  • supposedly “native-born and educated” college students often are deficient in basic arithmetic and reading comprehension;
  • a general education at the level we used to get at an Atheneum or Gymnasium [i.e., academic-track high schools in Europe] nowadays requires either a college degree or an expensive private prep school.

But simplistic numerical benchmarks are beloved of bureaucrats everywhere, as they are excellent excuses for bureaucratic meddling. As Instapundit is fond of remarking: the trouble with true gender- and ethnicity-blind fairness — and with true diversity, which must include the diversity of opinion — is that “there isn’t enough opportunity for graft in it”.

PS: apropos calling the original author of the essay names that essentially place him outside civil society, see a must-read editorial in the Boston Globe by historian Niall Ferguson. His wife, Ayaan Hirsi Ali, knows a thing or two about what real hardcore misogyny looks like, and how useless the Western liberal left is in facing it. Moneygraf of the op-ed:

Mark my words, while I can still publish them with impunity: The real tyrants, when they come, will be for diversity (except of opinion) and against hate speech (except their own).

PPS: the Beautiful but Evil Space Mistress weighs in on the controversy and applies some verbal ju-jitsu.

P^3S: heh (via an Instapundit comment thread): [image]

P^4S: Welcome Instapundit readers!

P^5S: Megan McArdle weighs in (via Instapundit) and reminisces about her own early years in tech.

Thinking back to those women I knew in IT, I can’t imagine any of them would have spent a weekend building a [then bleeding-edge tech, Ed.] fiber-channel network in her basement.

I’m not saying such women don’t exist; I know they do. I’m just saying that if they exist in equal numbers to the men, it’s odd that I met so very many men like that, and not even one woman like that, in a job where all the women around me were obviously pretty comfortable with computers. We can’t blame it on residual sexism that prevented women from ever getting into the field; the number of women working with computers has actually gone down over time. And I find it hard to blame it on current sexism. No one told that guy to go home and build a fiber-channel network in his basement; no one told me I couldn’t. It’s just that I would never in a million years have chosen to waste a weekend that way.

The higher you get up the ladder, the more important those preferences become. Anyone of reasonable intelligence can be coached to sit at a help desk and talk users through basic problems. Most smart people can be taught to build a basic workstation and hook it up to a server. But the more complicated the problems get, the more knowledge and skill they require, and the people who acquire that sort of expertise are the ones who are most passionately interested in those sorts of problems. A company like Google, which turns down many more applicants than it hires, is going to select heavily for that sort of passion. If more men have it than women, the workforce will be mostly men.

She explains how she then moved into a field — policy journalism — that is also heavily male, but one that she found she could get as passionate about as her former colleagues were about [then] bleeding-edge technology. Passionate enough, in fact, that she did it for free for five years (under the blog name “Jane Galt”) until she was hired by a major national magazine on the strength of her portfolio. Passion combined with talent can move mountains—or, if you pardon the metaphor, shatter glass ceilings.

P^6S: in the libertarian magazine Reason, David Harsanyi argues that by firing the Google memo author, the company confirms his thesis, and that “The vast majority of the histrionic reactions on social media and elsewhere have misrepresented not only what the memo says but also its purpose.” In the same magazine, Nick Gillespie adds that “The Google memo exposes a libertarian blindspot when it comes to power: it is not just the state that wields power and squelches good-faith debate”.

P^7S: now this is Muggeridge’s Law in action. (Hat tip: Marina F.) I was certain this was satire when I first saw it…

 

The beached whale ORCA: a campaign consultant con job?

I am a firm believer in Hanlon’s Razor (actually Heinlein’s Razor): “Never attribute to malice what can adequately be explained by stupidity/incompetence”, but at times that faith gets shaken. Such as by the ORCA fiasco (see Ace, Business Insider, WaPo, CNET, ComputerWorld, Politico, and Breitbart), which may very well have cost Romney the election. RedState (via Althouse) now has more on how the campaign acquired this turkey. Go read it all and weep: it is a horrifying story of nest-feathering by consultants, arrogance, and campaign decisions based on bogus statistics and poll numbers from the turkey’s droppings.

The result of all of these false numbers and inaccurate ground reports is simple: Mitt Romney was ill-prepared for the actual numbers on election day and his false sense of confidence directly translated into how the campaign operated in the closing weeks. In the words of one source, it was a con job. As David Mamet famously said, “If you’re in the con game and you don’t know who the mark is … you’re the mark.” Mitt Romney had no idea what was coming.

And thanks to the greed and hubris of a few we are now stuck with four more years of the worst administration in living memory. Thanks for worse than nothing.

UPDATE: More here from Bethany Mandel. And more on the company: it seems to consist of a bunch of execs and marketers and… two coders, one of whom actually used to work for Al Gore. I might have been able to overlook the latter, but on principle I avoid IT companies that are “all hat and no cattle” (or all jacket and no bomber) like the plague.