Saturday, February 28, 2015

Bite-sized chunks

I draw satisfaction and energy from accomplishing what I set out to do. The sort of work that I find most satisfying is work with a specific objective, where I can sit down, work for a reasonable stretch of time, and finish before I get up. I am happiest and most fulfilled when I have this sort of work and also have the freedom to get up and do something else for a while once I've finished what I set out to do. The kind of work I find most frustrating and draining is the kind with no end in sight. I hate working on massive projects where I wake up, go in to work, grind away at a problem, make some limited progress, consider everything that remains before the problem is taken care of, compare that to the pace at which I'm currently progressing, and realize that months will pass before I'm finished.

Every project can be divided into bite-sized chunks, so in principle I could always set a specific, manageable objective for each sitting. However, not every division into bite-sized chunks produces satisfaction rather than frustration. For work to be most satisfying to me, the chunks need to be self-contained. If I have to start each sitting by reviewing what I did in the previous sitting, then no matter how clearly I can define intermediate sub-goals, I am still working on the same project. The ideal project is one that has a start, a middle, and an end, with no connectors in between: I can begin without worrying about the work I've already done to prepare for it, and I can finish and stop worrying about everything I've just completed. I don't think the apprehension or satisfaction derives from the work itself per se. I think it's largely drawn from the time between when I start the work and when I finish it.

An important corollary for me when dealing with large projects is that large projects cannot be interrupted. I need to dive in, start working, keep working, and then finish. If a project takes me weeks or months, I need the ability to make work my life for those weeks or months. However, I am not at all willing to make work my life in general. So for me to do well at large projects and derive satisfaction from my work, I think what I am best suited to is binging intermittently, with rest and recreation in between. The typical workweek could, at least in theory, be thought of as having this feature: you grind for five days and then rest for two... assuming you actually grind for the full five days. I've found that in office life, most people quickly discover that the most productive course of action for promoting the combined benefit of their personal sanity and their career advancement is to keep themselves mostly in first gear and to learn not to care about results. This is not a condemnation of the average worker. It's a condemnation of the average workplace. At most workplaces, being significantly more productive than your coworkers is simply not rewarded.

But the real problem with the workweek is not that most people are just putting in the hours without actually putting in the effort that makes those hours valuable. It's that the grind and the interruptions are unrelated to the work. The weekends do not coincide with the ends of projects. The schedule of the office during the week does not necessarily fit the work on a project either. Everyone is involved in all sorts of meetings that take place at prespecified times and are not necessarily related to the ongoing projects. Office hours are eight to five: if you work late today, you are still expected to be in at eight tomorrow, and if you work extra hours all of this week, you're still expected to work your normal hours all of next week. You have a specific time window for taking lunch, and if you don't eat lunch then, you're not supposed to leave the office for more than fifteen minutes just to fetch something, which you can eat at your desk. After about two, you're really never supposed to be away from your desk for more than a five-minute stretch unless you're at a meeting, or unless you have a doctor's appointment or something like it that you've notified everyone in advance you would have to take a break to attend, etc. etc. etc.

For some types of work this schedule makes some degree of sense. There are many jobs where a significant part of the job description is to be there to answer the phone, with your attention fully devoted to the call whenever someone calls you. There are many other jobs that mainly consist of reaching out to people during normal business hours so that you can build deals and relationships on behalf of your company. These sorts of jobs should be based on a schedule because they must be: we need scheduling conventions to help people figure out when it is or isn't a good time to make a call. Interestingly, all of these jobs, or at least all the ones I can think of, are also well-suited to being based on a schedule. The work naturally fits into bite-sized chunks. You research and prepare one phone call or one presentation, and then you research and prepare the next one. You meet the needs of one client who has walked into your store, you complete the transaction, and then you move on to the next client. You're not editing and refining any of your past work.

Work like writing, engineering, software development, and even producing artwork is nothing like these kinds of work with respect to how tasks relate to time. The schedule that would best fit the kind of work that artists and engineers do (if my experience with projects is anything close to typical, and all of the research I am familiar with on the subject of flow and engagement suggests that it is) is to plunge, then vacation. Plunge, and then vacation. You might work 100-hour weeks for a month until something has reached a state of completion that allows you to forget about it. And if you do that, you should then be given a month off (or more, if you are trying to preserve something close to a 40-hour work week on average; see the arithmetic below). Why? Because nobody wants to make work his or her life, and nobody wants to constantly have incomplete work hanging over his or her head during time off. We want and crave completion, and we want and crave the ability to live our own lives and experience fun and interesting things. You really get neither of these things when you try to work on engineering projects with the traditional schedule, and that's a horrible shame. It's why I'm done with software development, unless I find a position that allows me to work at it pretty much on my own terms. I'd also discourage most smart young people from going into the field. It might pay well (actually, on an IQ-adjusted basis, it really doesn't pay well if you're smart), but it is not a career path that is conducive to a satisfying life. If the world needs more engineers, the world needs to make careers in engineering more satisfying. This isn't a problem that has anything to do with engineering itself. Writing code is fun; I'll still write code for myself in the future. It's a problem with workplaces, management, conventions, and schedules, all of which can be changed simply by people making appropriate decisions, and none of which require much work to implement. If humanity can't get those things right, it's not ready to tackle problems that are actually challenging.
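For what it's worth, the arithmetic behind that "or more" is simple. A quick sketch (the 100-hour and 40-hour figures are just the ones from the paragraph above):

    plunge_weeks = 4                              # a month of plunging
    plunge_hours = 100 * plunge_weeks             # 400 hours of work
    target_average = 40                           # hours/week, long-run average
    total_weeks = plunge_hours / target_average   # 400 / 40 = 10 weeks
    weeks_off = total_weeks - plunge_weeks        # 10 - 4 = 6 weeks of vacation
    print(weeks_off)                              # 6.0

So a month of 100-hour weeks buys ten weeks' worth of 40-hour work, which means six weeks off, not four, if the average is to hold.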

Friday, February 27, 2015

Is it possible to write entertaining fiction about victorious lawful good?

One of the things that happens when we live in a society full of entertainment, whether that is epic ballads, books, or movies, is that many of our experiences occur through fiction. Many of us have only experienced things like government, law enforcement, and prison through entertainment, and for most of us the bulk of our entertainment experience is fictitious. The story, an art form optimized to be engaging rather than optimized for consistency with reality, often takes particular forms dictated by plot considerations rather than by reasonableness. Why did the society depicted in The Giver decide to eliminate the perception of music and color from the human experience? The backstory of this world is full of wars and famines, from which humanity has been re-engineered for its own protection. But what sort of wars and famines could possibly occur that would cause people to decide that music was the problem? I can't imagine that context; it doesn't seem at all reasonable. Similarly, why did the society decide that it needed to instill honesty as the supreme virtue in all of its children, knowing full well that it was going to assign most of the children to roles in which it would ask them to lie to the rest of their community? I can see why most of the patterns that this society has constructed make sense. For example, I can believe that some circumstances would cause people to decide that they need to require all people to take medication to chemically castrate themselves as soon as they begin to experience the first tinglings of sexual awareness. I can imagine a society deciding that it needs to restrict the number of children per family to exactly two and practicing eugenics to keep the appropriate number of people in the population. It makes some limited degree of sense that this society might also abstract away the responsibility of childbearing and assign it to particular birth mothers while allowing no one else to procreate. It seems plausible to me that a society that decides to practice eugenics at birth would also decide to practice euthanasia at the end of life. All of these decisions seem like plausible outcomes of the backstory of this world. But why, oh why, does this society develop its particular form of education and instill in its children naive beliefs about how life should operate that are diametrically opposed to what they will experience in their adult lives? The answer, of course, is plot. Lowry didn't need to tell the story of how her world came to be exactly how it came to be; she only needed the plot to remain self-consistent with the world she gave us, and she only needed to provide enough detail about the past to enable suspension of disbelief. So we are presented with a world that is deliberately designed around ethics antithetical to the reader's, but in which the reader is first steeped in a belief system that takes his own naive ethics and multiplies them to an extreme. The purpose of this educational system is to make the plot work: so that we believe the main character would see this world through moral lenses very similar to our own, held even more strongly than our own. So that he might make an act of courage that we would wish him to make, even though we realize that we ourselves would be unlikely to make it. And why are there no colors, and why is there no music? Because we must be given a sense of horror.

The idyllic setting at first seems lovely and enticing; without a sense of horror there would be no story, and without some unbearable sacrifice there would be no horror. Yes, we would see that this world operates on a system of ethics completely contrary to normal modern beliefs, and we would also see that it operates on a system of ethics directly contrary to what it teaches, but we would also see that the way it operates works. Even more damning for us modern moralists looking on from our modern vantage point before the story begins, we would see the problems of our own world view: the way that unchecked population growth leads to starvation, and all of the irritating complexities of living in the modern world. If this world had color and music, it wouldn't be a dystopia; it would be the Shire. So the world cannot have color and music.

Futuristic fiction must have its dystopias because the plot demands it. If the Ministry of Love didn't torture all dissenters into conformity, and the government didn't forbid passion and romance in the lives of its subjects, the story of 1984 couldn't be told. If Longshanks didn't give his representative in Scotland the right of prima nocta, Braveheart would be the story of a lunatic. For stories of warfare and rebellion to connect with the audience, the audience must hate the enemy. But why is the enemy never the poorly organized and ineffective insurgents? Why must it be the government? Again, the answer seems to be not so much the moral alignment of the authors, or even necessarily the expected moral alignment of the readers. It's again for the sake of the plot. To be a hero, the protagonist must overcome overwhelming odds. We don't tell stories about a really great guy who happened to be a giant killing a much smaller and weaker fiend, any more than we tell stories about a little boy swatting a mosquito.

So in the stories we tell, powerful governments are necessarily evil and big businesses are necessarily corrupt. Evil is always set up to win, and virtue is always set up to lose, and only through the hero's virtue, courage, genius, and luck does good finally prevail. We live in a world where power is evil and wealth is corrupt not because reality requires it, but because this condition is required by our fiction. And most of us experience wealth and power, not through reality, but simply through fiction.

Wednesday, February 25, 2015

Never become a perfectionist (except when it's helpful to do so)

I spend a lot of time thinking about something I call the treachery of childhood. My experiences in life so far strongly indicate that the ways I was trained to think and behave as a child are strongly counterproductive. For anyone who failed to be a sufficiently rebellious child, becoming a functioning adult has a lot to do with unlearning the habits acquired in childhood. I don't regret remaining sober throughout high school. I don't regret remaining a virgin throughout high school. (I have some slight regrets about not making use of my first few opportunities to lose my virginity in college, but I think I would also have regretted the opposite had I been in that situation.) I don't regret doing my assigned readings, doing my homework, and doing well in school. I don't regret not having shoplifted as a teenager. I think all of those decisions were individually good decisions. What I do regret is being the sort of person who would make all of those decisions. I regret having been a good kid, because there is so much wrong with being a good kid. I regret not having done enough thinking for myself. I regret having been too willing to follow the advice of my elders and to abide by the moral customs of the culture that raised me.

All of the things I just listed are the reasons why it's bad to be a bad kid. Some of the things that people can do have consequences that they eventually regret, and everyone who lives with those regrets tells other people not to do those things, because they wish they'd never done them themselves. The lesson that those types of experiences teach is that some of the advice your elders give you, and some of the advice embedded in the moral customs of your culture, is good advice. However, some of the advice is also bad advice. Good kids who follow the rules and do what they're supposed to do make mistakes whenever what would be beneficial differs from what they are supposed to do. Bad kids who break the rules and do what they're not supposed to do mess up whenever what they're supposed to do actually is beneficial. Both of these policies are idiotic.

What is also idiotic is that the adherents of both of these notions advocate thinking for oneself. Those who give advice advocate "thinking for yourself," and those who advocate being a rebel advocate "thinking for yourself." Actually, I think this is a big part of why I failed to think for myself. I always viewed myself more as a thinker than as an individualist, so I did what the people who emphasized thinking suggested instead of what the people who advocated being myself suggested.

As always, the problem is that I didn't yet have any understanding of statistics when I was six years old or however old I was when I was first forming a strong sense of my own identity as a person and picking sides in this charade.

If you are actually thinking for yourself, you will sometimes decide that some of the rules and routine advice you receive are good. When that happens, you will follow them. You will also sometimes decide that some of the rules and routine advice you receive are stupid. When that happens, you won't follow them. Whenever multiple factions continue to exist, it is sensible to assume that neither has a monopoly on good information. If the rebels weren't doing anything right, all of the rebels would grow up to be failures, the people a little younger than them would be able to see that, being a rebel would quickly become incredibly uncool, and kids would grow out of it as fast as possible. Of course, this doesn't happen. A lot of rebels go on to become great successes. Actually, disproportionately many of the extremely successful people in the world were rebels in their younger years... in many cases, serious rebels who were too extreme to be mainstream, like Steve Jobs, whose time as a hippie instilled in him a hatred for the tyranny of basic hygiene like showering, or the cofounders of PayPal, the majority of whom built bombs in high school. Somehow, I don't think that not showering and building bombs are the things that helped them become more successful. What they did right is that they didn't listen to bad advice.

Anyways, if I had truly been thinking about what seemed right to do and trying to do it, I should have developed a personality that combined the different trends, along with some things that nobody else was saying. If neither side has everything wrong and neither side has everything right, then the optimal strategy, whatever it is, should include some things that each group says to do, some things that each group says not to do, plus some things that nobody talks about.

The perspective of rebelliousness isn't really worth analyzing on its own, so I won't analyze it. One can learn everything worth learning about it by analyzing the things that the rebels are rebelling against, and trying to figure out what portion of that perspective is worth keeping and what portion deserves to be thrown away. The good parts of the traditional moral perspective are pretty obvious: don't do the stupid stuff that people do as teenagers and later regret in life; don't take stupid risks; and don't become a total hedonist. Driving 100 mph on roads that were not meant to be driven faster than 45 is a bad idea. Failing to get enough sleep is unhealthy, whether you stayed up late studying or stayed out late partying.

The horrible advice also tends to fit into a few broad categories. The one I've been thinking about most recently is having high standards. Sometimes it's useful; usually it's not. I love hanging out with people who are frequently pleased and who frequently get excited. Being someone who is genuinely pleased and excited by things that other people regularly do is so rare and so beneficial that it's practically a superpower. It also has to make your own life far more pleasant.

A similar sin is wanting to have the finger of Midas: wanting everything I attempt to succeed and everything I touch to turn to gold. It's crippling, and it leads to all sorts of poor decisions. I wish I'd decided four years ago to just take more risks, begin presenting my thoughts to the world, and do a bunch of things that would likely have led to failure and possibly led to success. Four years from now, if I don't start doing it now, I think I'll still have the same regret. I had the same regret two or three years ago, and decided it was too late to change. D'oh! That was a horrible decision.

Here's some good advice: don't become a total hedonist. Have fun, though. Delay gratification enough that you are confident you are building a life that will be better five years from now than it is today, but don't delay it so much that you cease to enjoy living. Also, don't delay gratification without reason; becoming ascetic and depriving yourself of something when you don't know why you're doing it isn't going to help you with anything. When you delay gratification, you should do it for a specific reason with a specific belief in mind: I'm not going to buy X today because I want to save money so that I can do Y tomorrow... where Y might be something as complicated as "save up $20 million so that I can spend a thousand dollars a day living off my income while still having my net worth increase faster than inflation, assuming I don't utterly fail at my attempts to invest it wisely." Unless you're planning to buy a space elevator, delaying gratification in order to save up more than $20 million is not actually delaying gratification; it's foregoing it. It's asceticism at that point. Not many people are in that position, and yet many people would rather have a 10% shot at becoming a billionaire than a 50% shot at accumulating $20 million. Unless you have your heart set on collecting Picassos and Monets, or you want to build your own private island in the middle of the Pacific, or you want to make the biggest donation in the history of some organization you care about contributing to, it's hard to imagine why. There's practically no difference between the way you can live if you're able to spend $1,000 a day and the way you can live if you're able to spend $50,000 every day.
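To make the arithmetic behind that $20 million figure explicit, here's a back-of-the-envelope sketch (the spending and return assumptions are mine, purely for illustration):

    daily_spend = 1_000                         # dollars per day
    annual_spend = daily_spend * 365            # $365,000 per year
    principal = 20_000_000                      # dollars
    withdrawal_rate = annual_spend / principal
    print(f"{withdrawal_rate:.2%}")             # ~1.8% per year

    # Long-run real (inflation-adjusted) returns on a diversified
    # portfolio are commonly estimated at a few percent per year,
    # comfortably above a ~1.8% withdrawal rate, so the principal's
    # real value can keep growing while funding $1,000 a day.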

Getting back to bad advice: according to the Bible's book of Proverbs, "a man of many friends comes to ruin." So pick your friends carefully and have a few good friends. That's horrible advice. It's hard to even think of worse advice. Okay, that's a bit of an exaggeration: "go commit murder in a public place with lots of people watching and stand around afterwards to gloat" is pretty easy to think of, and it's worse advice. Having many friends is better than having a few good ones. Some people, mostly people steeped in traditional wisdom, disagree, but the evidence is squarely against them, assuming you treat the success and happiness of the people who adhere to advice as a good marker of whether or not the advice is good.

While I received mixed advice from my elders as a child about whether it was better to have many friends or a few, my elders were uniform in advising me and the other children I knew to be careful about whom we befriended. Some advised me to find a few of the right sort of people; others advised me to find as many of the right sort of people as possible; but they all strongly cautioned against becoming friends with the wrong sort of people. (This is not to say that everyone who gave me advice as a child gave me this advice or believed in it; some people never advised me one way or the other on this subject. Some of them, I would guess, remained silent because they knew their viewpoint disagreed with the dominant one in their community.) Of course, the right sort of people were people who had a similar background to ours, who followed the same advice we were being given, who wouldn't tempt me to question whether I should be doing the things I'd been told to do. This is even worse advice than telling someone to have few friends. People can't learn anything from talking to a mirror, and the closer their friendships come to approximating that experience, the less they can learn from them. Moreover, people's influence is measured much better by the breadth of the cross-section of society with which they have formed close associations than by how many people they know. Influential people tend to know retirees and students. Influential people have connections to politicians, criminals and prison guards, protesters and the people who do the things that protesters are protesting, slumlords and their tenants, journalists and engineers. Popularity separated from everything else is just fame: being Paris Hilton. Being well connected separated from everything else is influence: being a mob boss.

The American experiment, i.e., the type of democracy that first appeared in America, owes, in my opinion, a great deal of its success to the development of classlessness, of refusing to have standards.

When I think about the things I've gotten wrong in the past few years that I need to correct going forward, I think the biggest overarching mistake I've been making is trying to be too selective in general (not just with respect to forming connections) and looking for too much perfection in the immediate term. Perfection, like gratification, is something that often needs to be delayed. In the short term, there are a few forms of perfection worth striving for (perfect technique, for example), but that's another subject for another time. Sometimes, having low standards is a good thing. All of the natural rebels figured that out a long time ago; it's one of the things they got right. For people like me, to whom rebelliousness, for all the intellectual appeal it sometimes had, never came naturally, this realization clashes with entrenched beliefs and heavily trained tendencies. To correct for my natural tendencies, I think I need to have low standards except when I have a good reason to raise them. My natural inclination to be selective gets in my way.

Sunday, June 22, 2014

Larva and metamorphosis (Part 1)

One of the positions that Margulis generated a lot of controversy by supporting (though not necessarily stating that she thought it was true) is Donald Williamson's proposal for the evolution of metamorphosis in crustaceans and insects. He suggests that the larval and adult stages of many metamorphic animals descend from distinct ancestral lineages that hybridized to create the species with metamorphic life cycles that we are familiar with today. Upon publication, Williamson's positions were met with immediate, widespread scorn from most of the academic community.

I don't know much about crustaceans, and I don't know whether Williamson has suggested that the same might be true of, say, amphibians, but I doubt that he has. In the case of frogs and salamanders, it's abundantly clear that the standard Darwinian account of natural selection on a single lineage adequately explains the evolutionary process. The metamorphosis really isn't that drastic. Internal organs of the juvenile stages map directly to homologous organs of the adult. Neotenous individuals that become sexually mature without ever undergoing metamorphosis can mate with ordinary adults. We can find fish that exhibit similar life cycles, and we know of environmental factors that would selectively favor the adaptations exhibited by amphibians. The changes from juvenile stage to adult stage, while they happen quickly, are gradual. A tadpole can become a frog in a day, but it doesn't discard a tadpole shell and pop out a frog. It simply goes through a growth spurt in which most growth occurs in a particular set of limbs. Many fish undergo similarly drastic changes as they age. Some fish change sex. Other fish, for example flounder, have a body morphology that changes from a vertical orientation to a horizontal one. A huge percentage of chordates exhibit some morphological changes, other than uniform changes in size, in response to hormonal changes as they age. In this clade, it is very common for the most drastic of these changes to occur around the time the animal becomes a reproductively viable adult.

In short, we really don't need any special explanation of how metamorphosis evolved in amphibians. Some fish developed a tendency to become increasingly suited to life on land, rather than aquatic life, as they aged. In many ways, the origin of amphibians is a lot simpler than the origin of reptiles or mammals. Some biologists even speculate that multiple such lineages evolved independently, with some going extinct.

Contrast all of this with the insects that undergo complete metamorphosis. The first drastic difference we observe between amphibians and Pterygota, the subclass of insects in which metamorphosis occurs, is that in some members of Pterygota, the juvenile form bears practically no resemblance to the adult. We call these drastically different creatures larvae. Fascinatingly, in every case where the distinction is drastic enough for us to call the juvenile form a larva, there is another intermediate form, the pupa, in which the insect changes drastically from larva into adult. During this phase, it is typically extremely passive. Many pupae perform no normal biological functions like eating or reproducing or, in most cases, even responding to their environment. This phase more closely resembles a regression to an egg than it does any other life phase seen in animals. (The pupa remains extremely metabolically active during this time, making hibernation a poor analogy.)

A second observation that follows closely on the first is that the adult morphology predates the metamorphic life cycle, whereas in amphibians it was the juvenile state that had a precedent. In terms of phenotype, many insect larvae aren't arthropods. They frequently lack joints. They also tend to have extremely thin exoskeletons that exhibit no mineralization, even though practically all adult insects have much thicker, mineralized exoskeletons, as do the adults of most close relatives of insects. The oldest surviving order of insects, Archaeognatha, undergoes no metamorphosis and has the adult phenotype just described. The oldest members of the Pterygota, such as dragonflies and cockroaches, species that have remained largely unchanged morphologically for as long as the fossil record has contained examples of Pterygota, likewise have no larval form. The physiological changes that occur in these species as they age are frequently described as "incomplete metamorphosis," but the changes are not particularly drastic. Nymphs/naiads and adults both tend to have joints and thick exoskeletons. They share adaptations such as the complex jaw structure common throughout insect species. Some share an environment and food source with the adult. They change gradually into the adult form through a succession of molts. (Larvae of species undergoing more thorough metamorphoses also molt, but their successive instars do not bear increasing similarity to the adult.) In the orders of insects undergoing "incomplete metamorphosis," as in the amphibians, the young tend to resemble recent common ancestors much more thoroughly than the adult does. The resemblance of Archaeognatha to dragonfly naiads is stronger than its resemblance to dragonfly adults. All have the morphological characteristics of insects and arthropods that I've just described, but dragonfly naiads, like all insect nymphs and naiads (or at least all that I know of), are flightless.

The final observation for this post is that the drastic superficial differences between larvae and adults in species undergoing complete metamorphosis accompany underlying structural differences. I already pointed out that larvae tend to lack some defining characteristics of arthropods: exoskeletons and joints. It should therefore not be surprising that they also frequently lack defining characteristics of insects, such as a body separated into three regions or having six legs. These two observations are not rigid. Beetle grubs have a more adult-like exoskeleton than other insect larvae, especially in the head. Grubs, in general, have features corresponding much more closely to adult insects than do other insect larvae.

More deeply still, internal organs in insect larvae frequently lack mappings to homologous organs in the adult.

Finally, there is no accepted explanation for how insect metamorphosis evolved. For amphibians, we have a compelling explanation of the process by which fish acquired the ability to survive on land. Obviously, we don't know all of the details, and we probably never will, but when we look at what we do know, we aren't left with the feeling that much more explanation is required. We understand what happened well enough that, given the ability to accelerate time, we can be pretty confident that we could choose some fish with the appropriate characteristics and subject them to the appropriate environmental factors to produce amphibians.

For insects undergoing metamorphosis, we have no such story. The evolutionary history of insects is not an easy thing to study* because nothing close to the amount of published and collated research that exists for the evolutionary history of chordates exists for insects. Particular facets of insect evolution, such as their divergence from other arthropods, the origin of metamorphosis, how they adapted to land, and how they developed flight, are not well understood.

Margulis and Williamson were certainly correct in asserting that one of these facets of insect evolution, metamorphosis, requires additional explanation. They may or may not be correct in their belief that some of these things will require a significant reappraisal of our core beliefs about how evolution works. Margulis was certainly correct in her insistence that understanding the origin of Eukaryota required a significant revision of our understanding. Almost everything she proposed related to endosymbiosis has been vindicated by genetic studies in the last 30 years.

The reaction that Williamson has received is the reaction that has plagued thought and science throughout recorded history. In particular, we often see entrenched academic communities reject the assertion that their current understanding is incomplete when, in fact, everyone's understanding of everything is always incomplete. Williamson has picked a very specific example of something for which our current understanding is grossly incomplete and proposed a radical explanation, one that has no accepted precedent in the study of animals, though hybridization is known to occur regularly in plants.

None of these observations say anything about the particular merits and demerits of Williamson's proposal. I plan to discuss the subject some in the future.

(Full disclosure: Margulis has a lot of unconventional ideas. I strongly agree with about half of them, such as the relative importance of genes vs. cellular structure, the relative impact of humans vs. microbes on the environment, the computational intractability of reality, and her general stance that it is productive and accurate to view human creation of machines as an extension of biology. As I expressed in my last post, I strongly disagree with her views on meiosis. That is her only opinion in (non-medical) biology with which I strongly disagree. I am also not an AIDS denier. I find the medical community's perspective more credible than Margulis's on that subject. If the symptoms of HIV/AIDS really were just a manifestation of syphilis, it is hard to imagine that no one would have found a way to treat it. There are known ways to treat syphilis into remission. I haven't scoured the literature to find out whether anyone has ever tested syphilis treatments on AIDS patients. However, it is almost inconceivable that no one has ever been diagnosed with both diseases and treated for the one that is treatable. If such treatments consistently caused the disappearance of AIDS symptoms, someone would probably have noticed and begun testing syphilis treatments on more people with AIDS. I am also not a 9/11 truther. However, I don't know enough about the subject to have an informed opinion on it, largely because I don't really care. As far as I'm concerned, it's simply a matter of whether to assign primary blame to one party or the other for one particular incident in the conflict between Western elites and traditional Middle Eastern peasants that has been escalating for the past thirty years. For my own part, I think there's enough blame to go around that assigning 4,000 extra deaths to one side or the other doesn't noticeably add to the crimes of either side.)

*I use the word study to mean "read what other people have researched." I use the term research to mean either "investigate through painstakingly detailed observation" or "perform experiments." In general, you can accumulate information much more quickly, and assimilate information from a much broader range of sources, by studying than you can by performing research. Professional scientists tend to make their living, at least early on, by doing research, because that is the part that actually requires work. I am of the opinion that studying results in the formation of more accurate ideas than research does, because there is a great wealth of inadequately analyzed research in existence. Generally speaking, great scientists before the twentieth century tended to do their own research, but the most influential scientists of the past hundred years have tended to study other people's research instead. (Notable examples of people in the second category include Einstein, Heisenberg, Wheeler, Feynman, Hawking, and Margulis. My list is obviously colored by the fact that I have tended to be more interested in physics than in the other hard sciences, and by the fact that I dislike typing non-ASCII characters.)

Sunday, June 15, 2014

Microcosmos: meiosis and parthenogenesis

Microcosmos is an excellent book and its author was a fascinating woman.

In keeping with the name of this blog, I will try to have my first post about any given book focus on the parts I disagree with. Microcosmos is so full of unorthodox perspectives that any disagreement I could have with it will necessarily be more orthodox than the perspectives in the book. Hence, I will have to revisit some of the topics later in order to express how strongly I am persuaded by Margulis's insights.

The only major points in the book that I find highly dubious have to do with its assertions relating to meiosis. Margulis believes that eukaryotic sexual reproduction is maladaptive, and that meiosis is valuable because it provides increased stability by serving as an elaborate biochemical roll call. She disagrees with the belief that it serves to increase diversity.

Many species exhibit parthenogenesis. Some reptiles are entirely parthenogenetic. They survive. They exhibit phenotypic diversity within the species. Sex is not biologically necessary, even for Eukaryota. Hurrah!

Cnemidophorus neomexicanus is entirely parthenogenetic. There are no males in the entire species, and the species doesn't mate. It's more numerous than its close relatives, which mate normally, indicating that parthenogenesis can clearly be adaptive if it can be pulled off.

The same is true of other species that reproduce parthenogenetically: their close relatives very seldom do.

Parthenogenesis has clearly evolved independently on many different occasions. It is extremely unlikely that all of those occasions occurred in the past ten million years; rather, it has been happening for as long as there have been animals. Yet these species almost never go on to form clades. Some clades, like the rotifers, have many species that exhibit both male-female mating and parthenogenesis. But there is no existing clade of more than five species that is entirely parthenogenetic.

The available evidence strongly indicates that parthenogenesis evolves frequently and easily in some classes of animals. However, the species that develop it rarely issue new lineages that survive past the million years or so of environmental consistency that typically constitutes a given species's time in the sun. As the world changes, these species remain the same. Hence, they go extinct. Their mating relatives much more successfully issue progeny adapted to the evolving world.

In the case of Cnemidophorus neomexicanus, we know with certainty that the species didn't somehow evolve into its sexually reproducing near relatives. Its two closest relatives can hybridize to create new, always female, Cnemidophorus neomexicanus. (I.e., this is a hybrid species of two sexually reproducing species.)

Let's pick another species.

Komodo dragons can reproduce parthenogenetically, but all offspring so produced are male (and can then mate with female Komodo dragons to produce female or male Komodo dragons).

Clearly, they aren't leveraging parthenogenesis as much of a survival strategy. There are plenty of males on the islands where Komodo dragons live, and we have no reason to believe that the males randomly die out while the females survive. Instead, we see that under reptilian reproduction, parthenogenesis is an easy adaptation. Komodo dragons exhibit this trait simply because they can. It doesn't help them survive. It hasn't helped them colonize new territory (they are naturally confined to a very small area). It's simply something they do because an unfertilized Komodo dragon egg, which carries a single set of chromosomes, can double that set and develop as a diploid, which means that whenever a female lays unfertilized eggs, the male members of that clutch can survive. (Komodo dragons do not have X and Y chromosomes to determine sex like humans do. They have Z and W chromosomes: males are ZZ; females are ZW. Only males result from parthenogenetic reproduction, because a doubled W, i.e., WW, is not viable, while a doubled Z yields a viable ZZ male.)
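Here's a minimal sketch of that ZW bookkeeping in Python (my own illustrative code, with the genotype-to-outcome mapping taken from the description above):

    # Each egg from a ZW mother carries a single sex chromosome, Z or W.
    # In parthenogenesis that single chromosome set is doubled, so the
    # embryo ends up ZZ (a viable male) or WW (not viable).
    def parthenogenetic_outcomes(mother="ZW"):
        outcomes = []
        for chromosome in mother:
            genotype = chromosome * 2              # the single set is doubled
            viable = genotype != "WW"              # WW embryos do not develop
            sex = "male" if genotype == "ZZ" else None
            outcomes.append((genotype, sex, viable))
        return outcomes

    print(parthenogenetic_outcomes())
    # [('ZZ', 'male', True), ('WW', None, False)]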

Other animals that display parthenogenesis display it because it has been similarly easy for them to evolve, not because it is a winning strategy. It might be winning in the short term for some species, but species that develop extreme dependence on it as a means of reproduction rarely speciate into new lineages, and they typically fail to survive the environmental shifts that cause new species to emerge among the mating animals.

If it were easy and adaptive, it would be ubiquitous. We have strong evidence that parthenogenesis is an easy adaptation for many species. We know that it is not ubiquitous: far more species reproduce sexually. Therefore, we have very strong reason to conclude that sexual reproduction, not parthenogenesis, is adaptive, at least in the long term.

Hermaphroditism among animals and plants leads us to the same conclusion. Many hermaphroditic species have adaptations that prevent them from mating with themselves, again indicating that sexual reproduction is adaptive for Eukaryota. Hermaphroditism is a very easy mutation. A huge proportion of plant and animal species have some hermaphroditic mutants; it's likely that all do. Yet very few species are predominantly hermaphroditic, and the clades that are, e.g., many groups of plants as well as some animals like snails and slugs, have also developed traits to prevent autosexual reproduction. Again, this gives us strong evidence that the genetic exchanges from mating produce the variety that helps species adapt to changes in their environment and evolve accordingly. Autosexual reproduction is obviously less expensive in the short term, so it is in some sense adaptive: animals that can clone themselves without going to the trouble of finding a mate can fill their evolutionary niche a lot more easily than their less fully endowed relatives. However, such species rapidly go extinct once their niche goes away, and they leave no daughter species after them.

Margulis may be right about a lot of things. I strongly believe she is. However, the biological evidence (not just the entrenched consensus of academia) is overwhelmingly against her claim that sexual reproduction is a maladaptive vestige of early eukaryotic evolution that will vanish when faced with the superior reproductive methods of eukaryotic species that can clone themselves. Locally, when the ability to essentially clone oneself develops in an individual that is very well suited to a particular niche, yes, that individual and its clones will go on to outcompete their relatives and fill the niche. However, a brief survey of eukaryotic life will quickly tell you that those species will not survive to produce daughter species, whereas the mating organisms occupying the world's other ecological niches much more frequently will.