Thursday, October 1, 2009

More baby steps in AIDS research

As a follow-up to the AIDS vaccine post, here is a forthcoming article in Science about the identification of antibodies active against HIV, found in the blood of an African donor.

There is an ongoing effort to isolate HIV-neutralizing antibodies from people infected with the virus. One of the recurring issues with these neutralizing antibodies is that the ones identified so far are very specific to the strain that infected that particular individual, but don't do much for somebody infected with a different strain (or maybe not even a mutated form of the same strain).
Antibodies in general tend to be very, very selective for a particular target, which is desirable for antibody-based medicines, since it reduces the risk of unwanted off-target activity, but not so desirable if you are trying to target something like HIV, where you are dealing with multiple, constantly evolving strains.

Also, what you mostly observe with antibodies is that the less selective the antibody, the less potent it tends to be.
Having said that, what makes this article noteworthy is that, for the first time, researchers have succeeded in isolating "broad and potent neutralizing antibodies". The two antibodies identified showed potency against 127 and 119 different strains of HIV, out of a total of 162 strains they were tested against.
So this looks very promising and might open the door for antibody-based treatments for people infected with HIV.

To follow up on the vaccine story: the problem with the AIDS vaccine is that it showed only partial efficacy. The knowledge gained from these antibodies would be just what is needed to come up with a better vaccine, i.e. one that is potent against a broad range of HIV strains.
But the details in the article show how difficult it is going to be to use this as the basis for an AIDS vaccine:
When the team looked into what made these antibodies so potent against such a broad array of HIV strains, they found that the antibodies did not work at all when exposed to isolated proteins from the surface of the virus. They only worked against a trimer formed by three proteins on the viral surface, binding to a specific part of the overall surface formed by these three proteins.

This is going to make it hard to use these findings for an AIDS vaccine, since you can't tell the human body "Go make exactly this antibody". You can only take something you want the immune system to react to, put it into a vaccine, and let the immune system come up with a way to deal with this.
So even if you manage to put these three proteins that were identified (or pieces of them) into a vaccine, there is no telling exactly which part of the overall protein surface the antibodies produced by the immune system will bind to.
So when somebody gets the vaccine, the immune system might produce antibodies that are very selective, bind to some other part of the protein surface, and lack the broad spectrum of potency that is needed to fight an HIV infection.

The article also describes some initial work to identify exactly where and how the antibodies bind to that trimer. Let's hope this can be used to synthesize a protein fragment that would trick the immune system into producing the right antibodies.

... and also hope HIV is not going to quickly mutate its way around this, as it has done before.

Ref.:
Walker LM, Phogat SK, Chan-Hui PY, Wagner D, Phung P, Goss JL, Wrin T, Simek MD, Fling S, Mitcham JL, Lehrman JK, Priddy FH, Olsen OA, Frey SM, Hammond PW, Protocol G Principal Investigators, Miiro G, Serwanga J, Pozniak A, McPhee D, Manigart O, Mwananyanda L, Karita E, Inwoley A, Jaoko W, Dehovitz J, Bekker LG, Pitisuttithum P, Paris R, Allen S, Kaminsky S, Zamb T, Moyle M, Koff WC, Poignard P, & Burton DR (2009). Broad and Potent Neutralizing Antibodies from an African Donor Reveal a New HIV-1 Vaccine Target. Science (New York, N.Y.) PMID: 19729618

Monday, September 28, 2009

The Future of Structural Genomics?

I just came across this article in Science magazine that I believe is an interesting advancement in the field of structural genomics.
A group from the Scripps and Burnham Institutes, including Adam Godzik, took the whole genome of Thermotoga maritima, a thermophilic bacterium with a small genome, and modeled all the proteins and metabolic pathways, 478 proteins in total. This also included figuring out the function of a good number of proteins and then reconstructing the metabolic pathways.
Of the 478 proteins, 120 had been determined experimentally. Of the remaining 358 proteins, about half could be modeled with pretty good confidence (i.e. better than 30% homology). Only 3 of the proteins required some major tinkering to get at least a rough idea of what the fold looks like, and the analysis of the folds is what the group is focusing on.

First, I think the fact that two thirds of the structures came either from experiment or from reliable homology modeling is pretty encouraging, but that depends on whether you are a glass-half-full or glass-half-empty kind of person.

Second, once they had the enzymatic pathways modeled this way, the group identified a minimal set of proteins essential for the bacterium to survive. They found three groups of proteins: "core essential", where if you take out one, game over for Thermotoga, "synthetic lethal", where there is a built-in redundancy in the pathways such that one protein alone is not essential, but taking out more than one is lethal, and "non-essential", which are, well, non-essential.
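As a toy illustration of that classification, here is how single and double knockouts on a miniature, made-up metabolic map would sort proteins into the three groups. The network and enzyme names here are hypothetical, not from the paper; the paper's actual reconstruction is of course far larger and more detailed.

```python
from itertools import combinations

# Hypothetical mini-network: each essential product maps to the
# alternative routes (sets of proteins) that can produce it.
essential_products = {
    "ATP":    [{"enzA"}],            # single route: enzA is irreplaceable
    "lysine": [{"enzB"}, {"enzC"}],  # two redundant routes
}
proteins = {"enzA", "enzB", "enzC", "enzD"}  # enzD only makes a pigment

def survives(knocked_out):
    """The cell survives if every essential product still has a route
    that uses none of the knocked-out proteins."""
    return all(
        any(route.isdisjoint(knocked_out) for route in routes)
        for routes in essential_products.values()
    )

# Single knockouts: core essential proteins are lethal on their own.
core_essential = {p for p in proteins if not survives({p})}

# Double knockouts among the survivors: synthetic lethal pairs.
synthetic_lethal = {
    frozenset(pair)
    for pair in combinations(proteins - core_essential, 2)
    if not survives(set(pair))
}

non_essential = (proteins - core_essential
                 - {p for pair in synthetic_lethal for p in pair})

print(core_essential)    # only enzA
print(synthetic_lethal)  # the enzB/enzC pair
print(non_essential)     # only enzD
```

Knocking out enzA kills the toy cell outright (core essential), enzB and enzC cover for each other until both are gone (synthetic lethal), and enzD never matters (non-essential), which mirrors the three groups described in the paper.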

The level of detail that the modeling allowed is pretty impressive (I think). I wonder how long it is going to take to go from this to a human or mammalian cell. An analysis like this could have a really big impact in drug discovery, where the prevailing thinking of affecting one target to cure a disease has been running into a bit of trouble lately, imho. Analyzing networks of proteins this way could be a major step towards figuring out the best way to affect disease states using polypharmacology.

Of course, going from 478 to 20,000-25,000 proteins is not going to happen tomorrow (or ever?). Knowing how many proteins there are would be a good first step, I guess.
Did we finally figure out how many protein-coding sequences there are in the human genome? I have to check.

Ref.:
Zhang Y, Thiele I, Weekes D, Li Z, Jaroszewski L, Ginalski K, Deacon AM, Wooley J, Lesley SA, Wilson IA, Palsson B, Osterman A, & Godzik A (2009). Three-dimensional structural view of the central metabolic network of Thermotoga maritima. Science (New York, N.Y.), 325 (5947), 1544-9 PMID: 19762644

Thursday, September 24, 2009

Towards an AIDS vaccine, once more

There is some hopeful news that an AIDS vaccine might be possible.

There have been multiple trials of AIDS vaccines in the past, all of which failed, but now for the first time there is clinical data showing that infection rates were in fact lowered by the vaccination. Only by a third, though, so there is still a long way to go, but it is a step in the right direction.
Why, you ask, is an AIDS vaccine so hard to develop?

The reason is that almost all vaccines work on diseases that the human body can, for the most part, successfully fight on its own. Take smallpox, for example: about two thirds of the infected recovered from it before vaccines against it were developed. So all the vaccine needs to do is prepare and boost the immune response ahead of time. This is how most vaccines work.

Chronic diseases, on the other hand, have developed strategies to outsmart the human immune system. No matter how hard the immune system tries, for the vast majority of people it cannot clear the infection. So simply boosting and preparing the immune system through a vaccine does not work here, since there is no adequate immune response in the first place.

This is the reason why a different strategy was used for this AIDS vaccine. The vaccination consists of two vaccines, each of which works through a different mechanism. One of the two had been tested before and was found not to work well enough on its own.
The downside of this study is that the vaccines specifically targeted HIV strains circulating in Thailand, where the study was performed, and even then only partial protection was achieved.

So maybe, as with the AIDS drugs, the way forward for an AIDS vaccine is to have cocktails of different vaccines that complement each other.

If (big if) these initial results hold up.

Monday, September 14, 2009

Objects in the rear view mirror

There is an article in Businessweek about "How Science Can Create Millions of New Jobs". The article then goes on to lament how basic research exemplified by stalwarts like Xerox PARC and Bell Labs has declined and therefore needs a shot in the arm to create the jobs of the future.

I think the article misses the point by quite a distance. The question is not where growth came from in the past and how we can revive those glorious times. The real question, I think, should be: Where is basic research currently going strong, inside and outside of the US, and what can be done to boost research in the areas where the US is lacking?

The mistake the article makes is to only mention basic research in telecommunications and computer hardware, two areas which admittedly have driven job growth globally for more than a decade. But I think the cutting edge of this area, and its big effect on employment, has passed. There is always room for surprises, but I think the downsizing of the likes of Xerox PARC and Bell Labs is not the cause of the drying up of basic research, but an effect of getting less and less return for every dollar spent on basic research in these areas.
So instead of looking at the past, let's move on: where is the new stuff coming from? One way is to look at the areas where a lot of money is being spent on basic research by industry and the public:
  • Biotech, including bio-agriculture
  • Medical research, for example analytics, medical devices
  • Alternative/green energies, including things like battery technology, solar power, electric vehicles/hybrids, biofuels
In the first two areas the US is one of the major players, if not THE dominant player. But for a long time the US missed the boat on the potential economic impact that energy efficiency might bring. Americans insist on their God-given right to drive oversized cars that consume bathtubs of gasoline, which is fine. But the rest of the world has gotten smart about energy efficiency and alternative energies. In hybrid cars, wind turbines, solar power, and biofuels the US is playing catch-up with countries like Japan (no surprise here), but also Denmark, Brazil, and Israel, which indicates that the playing field has been wide open for smart new entrants.

Fortunately the current administration seems willing to pour money into these areas now. The Xerox PARCs and Bell Labs of the future are called DOE and Scripps (DARPA is still playing a big role).

The Businessweek article states "With upstream invention and discovery drying up, innovations capable of generating an industry have thinned to a trickle." I think the author is missing the torrent of innovations that have been happening while he was staring at his rear view mirror.

Thursday, September 10, 2009

Research on the bleeding money edge

Here is an article from the Pittsburgh Tribune-Review about a lawsuit by Onconome, a privately owned biotech in Seattle, against Dr. Robert H. Getzenberg of Johns Hopkins and the University of Pittsburgh.
Getzenberg and the University of Pittsburgh hold patents for some cancer-related biomarkers stemming from Getzenberg's research. Onconome was funded to commercialize this finding, and Getzenberg was the CSO of the company until 2008.
The lawsuit claims that Getzenberg made the whole thing up and the biomarkers never worked.

So, yet another possible scientific fraud, but it brings up a good question: How do you find a balance between taking a wait-and-see approach, at the risk that somebody else gets in early and reaps the rewards, versus getting in early on a scientific discovery, at the risk that there might be some nasty surprises?
It doesn't have to be fraud; there are plenty of cases where VC funds, biotechs, and pharma routinely spend a lot of money on things that do not live up to the initial high expectations, aka hype (the human genome comes to mind). Some of these things do come back once the expectations are adjusted and turn out to be useful.

What I find mildly funny about the lawsuit is that Onconome is also suing the University of Pittsburgh for "failing to properly supervise Getzenberg's research". Never mind that Getzenberg worked for 7 years or so for Onconome and produced scientific results showing imaginary progress on commercializing the biomarkers, apparently without proper supervision there as well.

Monday, June 22, 2009

Disrupting Pharma: Personal medicine, maybe too personal?

One of the chapters in "The Innovator's Prescription" by Clayton Christensen et al. is about how pharmaceutical research is going to work in the future.
One good thing about the book is that the authors clearly distinguish between what they call "precision medicine" and the buzzword du jour, "personalized medicine". For them, precision medicine is about using technology to go from intuition-based medicine to a clearly analytical and data-driven approach, where diagnostics, biomarkers and the like are going to play a major role, rather than the individual experience and, well, intuition of the MD you are talking to.

Instead of developing a single "blockbuster" drug that does a little for a whole lot of people, they believe that the drugs of the future are going to be targeted at sub-populations for which the impact is going to be much larger, since the drug will affect pathways with clear clinical significance for that sub-population, i.e. a big bang for a smaller group of people instead of the ho-hum effect of the blockbuster.
If you have a drug that works well for a certain pathway, all you need is a diagnostic test to check if that particular pathway is the right one to target and you can be pretty sure the drug is going to work (the precision medicine part).

Question is: How do you identify the targets and the sub-population initially?
This is where the term "personalized medicine" and genotyping typically come in. The thinking is that all you need to do is look at the genes of patients, and the variations therein, and presto, the differences tell you which pathway to develop a drug for.

There is one little problem with this: The emerging research indicates that the sub-populations might be smaller than you think, and the big bang might also not be what everybody hoped for.

The story goes like this:
When researchers currently look at variations in the genome, they look at single changes in the sequence, so-called single-nucleotide polymorphisms (SNPs). You take a whole bunch of people, some healthy, others suffering from the ailment you want to investigate, and you look at known SNPs in the two populations, trying to find disease-relevant SNPs.
In the ideal case, one of the changes occurs only in the sick population, but not in the healthy.
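To make that concrete, here is a toy case/control comparison for a single SNP, with entirely made-up counts. This is a minimal sketch of the idea, not the statistical machinery of a real genome-wide study (which involves millions of SNPs and correspondingly stricter significance thresholds):

```python
def chi_square_2x2(a, b, c, d):
    """Chi-square statistic (1 degree of freedom) for a 2x2 table:

                   SNP carriers   non-carriers
        cases           a              b
        controls        c              d
    """
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# The ideal case: the SNP shows up only in the sick population.
ideal = chi_square_2x2(80, 20, 0, 100)     # enormous statistic, clear signal

# The realistic case: the SNP is only slightly enriched among the sick,
# and plenty of healthy people carry it too.
realistic = chi_square_2x2(12, 88, 8, 92)  # ~0.89, well below the 3.84
                                           # single-test threshold for p < 0.05

print(f"ideal: {ideal:.1f}, realistic: {realistic:.2f}")
```

The realistic table is the situation described below: the SNP is present in some patients, but so many carriers are perfectly healthy that the association barely registers.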

In reality, the ideal case never happens.
The more experience researchers gain in running these studies, the more it looks like the sub-populations are very small indeed, or alternatively, if you do find something, a lot of people who have that SNP are perfectly fine.
Even for diseases like schizophrenia, where genetic predisposition is known to increase the likelihood of suffering from it dramatically, the most frequently associated SNPs identified so far can only explain a few percent of the schizophrenia cases.

There might be a good explanation for this: If a large percentage of a population suffers from a disease that affects their chances of survival, over time evolutionary pressure would eliminate that sub-population, or would lead to other changes in the genome that make an individual more robust to the effects of that single change.
So after a while of evolution running its course, what is left are disease-related genetic changes in the population that either occur only rarely (affect only a very small portion of people) or are not very significant (people have that SNP, but some other pathway can compensate for the change, and the SNP in itself cannot really help distinguish between the sick and the healthy).

This might explain why for some relatively new diseases, like HIV, the analysis of SNPs works well, but for "older" diseases, like hypertension, it has not really worked that well, since evolution has had a chance to take care of those.

Since pretty much every pharma company on the planet seems to be jumping on the "personal medicine" bandwagon, it might be a challenge to come up with a business model that would work for a truly, very personal medicine.

References:

Need, A., Ge, D., Weale, M., Maia, J., Feng, S., Heinzen, E., Shianna, K., Yoon, W., Kasperavičiūtė, D., Gennarelli, M., Strittmatter, W., Bonvicini, C., Rossi, G., Jayathilake, K., Cola, P., McEvoy, J., Keefe, R., Fisher, E., St. Jean, P., Giegling, I., Hartmann, A., Möller, H., Ruppert, A., Fraser, G., Crombie, C., Middleton, L., St. Clair, D., Roses, A., Muglia, P., Francks, C., Rujescu, D., Meltzer, H., & Goldstein, D. (2009). A Genome-Wide Investigation of SNPs and CNVs in Schizophrenia PLoS Genetics, 5 (2) DOI: 10.1371/journal.pgen.1000373

McCarthy, M., Abecasis, G., Cardon, L., Goldstein, D., Little, J., Ioannidis, J., & Hirschhorn, J. (2008). Genome-wide association studies for complex traits: consensus, uncertainty and challenges Nature Reviews Genetics, 9 (5), 356-369 DOI: 10.1038/nrg2344

Gibson, G., & Goldstein, D. (2007). Human Genetics: The Hidden Text of Genome-wide Associations Current Biology, 17 (21) DOI: 10.1016/j.cub.2007.08.044

Sunday, June 7, 2009

Zapping malaria

The WSJ and The Economist already reported on this one. But better late than never, I just can't resist posting this:

Intellectual Ventures, a company founded by ex-Microsoft guys, is working on a new way to limit the spread of malaria by taking aim at the mosquitoes that spread the disease - literally.

The company is working on something that sounds like a parody of Reagan's "Star Wars" missile defense initiative: A high-tech "fence" that can distinguish mosquitoes from other things (like, for example, humans) by the sounds they make and kills incoming mosquitoes mid-flight using lasers.
Or, as the company puts it so eloquently on one of their web pages: "Shooting mosquitoes with frickin' lasers"

It is definitely an entertaining idea. But most of the spread of malaria affects regions where people have trouble affording insecticide-treated bed nets costing less than $10, so I am not sure how big the market for this is.

Saturday, June 6, 2009

Disrupting healthcare for good

I recently finished the book "The Innovator's Prescription" by Christensen, Grossman, and Hwang.
For those who might not know, Clayton Christensen is the author of the classic book "The Innovator's Dilemma" and coined the term "disruptive technology", explaining how even well-run companies can disappear quickly if they miss the threat from cheaper and rapidly evolving technologies.
For this book, Christensen, a Harvard Business School Professor, has teamed up with two MDs, Grossman and Hwang, and takes a stab at healthcare.

This time around it is not about warning managers of companies within our current healthcare system that they might go the way of the steam engine manufacturer. Just the opposite, the aim is to figure out how the current healthcare system will meet the same fate the steam engine did: to be replaced with something much more efficient and cheaper.

The problem with our current healthcare system is that there are no free-market forces at play, and the whole thing is something a central planner from the former USSR might be proud of. Or rather, given how bad it is, maybe not even that.

The strength of the book is the analysis of the current situation from an economic/business viewpoint. A good number of the proposed solutions are driven by technology, which is where the book has its weak points in places. The biggest pieces of the current mess are all analyzed: hospitals, medical education, pharma companies, regulators, insurance companies.

Overall, it is a thought-provoking book, and having read this analysis, it is surprising how little of the current "healthcare is an economic problem" discussion is based on (good) economic arguments and business thinking.

I will have some more posts about this in the future.

As a side note: "The Innovator's Dilemma" was also mostly analysis. However, Christensen made one prediction in the book: that electric cars would disrupt the automobile industry.
The book was published the same year the Toyota Prius came on the market in Japan, 1997, and the Prius would not be introduced to other markets until 2001.

Did anybody at GM or Chrysler read the book?

Friday, June 5, 2009

A shift at Pfizer

There is an early access article at Drug Discovery Today (via ScienceDirect) titled "New working paradigms in research laboratories" by two researchers at Pfizer.
There have been all kinds of news and rumors about reorgs at Pfizer going around for quite some time now (check for example here and other posts at pipeline.corante.com).
Maybe this article offers some clue.

So, according to this article, what is Pfizer up to?

First, the article is only about the in-vitro screening part of the business, and although it doesn't say so explicitly, I think we are really talking high-throughput screening (HTS) here.

And the new paradigm: working shifts.

One of the perpetual problems of large pharmaceutical companies is to find a balance between doing basic research, where a certain amount of chaos, err, I mean creativity is needed, and running a well-oiled industrial process where you put money in at one end, and pills that earn you even more money pop out at the other end. In a regular, predictable manner.

HTS is probably more on the industrial end of the spectrum of activities going on in a pharmaceutical company. So looking at this from a resource management and allocation point of view makes some sense. Working in shifts probably really does allow more screens to be run and all those expensive robots to be utilized better.

But, reading through the article, there are quite a few changes that were made to enable the shift work (which is two shifts, from 6am to 2pm and 1pm to 9pm, alternating every week):
- unhindered access to equipment
- ability to focus without the distraction of meetings
- hand picking the staff
- new processes which were trialled in the team

One can only wonder how much of the increase in productivity really came from working in shifts and how much came from these other four points.

Item number 2 is a winner in my book.

Wednesday, June 3, 2009

Burning biomass better than converting it to biofuel?

Well, it seems to be more efficient.

This article and the companion editorial in Science magazine compare the efficiency of powering cars using biomass converted to ethanol in combustion engines versus burning the biomass to generate electricity that is then used to power a plug-in electric car.

Their conclusion: With current technology it is far more efficient to burn the biomass and convert it into electricity. With biomass-to-ethanol, less than 10% of the energy originally stored in the biomass is available to power the vehicle. With biomass-to-electricity, that number goes up to 20-25%.
The study looks at conversion efficiency from biomass to transportation only. What is missing are things like the environmental impact of old batteries, but also things like using the excess heat for residential heating, which is done quite successfully in Europe.
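Taking the quoted efficiencies at face value, the back-of-the-envelope gap looks like this. This is a rough sketch using the round numbers above, not the paper's detailed well-to-wheels accounting:

```python
biomass_mj = 100.0  # energy stored in a batch of harvested biomass, in MJ

# Ethanol route: less than 10% of the stored energy ends up moving the car.
ethanol_to_wheels = 0.10 * biomass_mj        # optimistic upper bound

# Electricity route: 20-25% of the stored energy reaches the wheels.
electric_low  = 0.20 * biomass_mj
electric_high = 0.25 * biomass_mj

low  = electric_low  / ethanol_to_wheels     # 2.0
high = electric_high / ethanol_to_wheels     # 2.5
print(f"Per unit of biomass, the electric route delivers {low:.1f}x to "
      f"{high:.1f}x the useful energy of the ethanol route.")
```

So even granting the ethanol route its 10% upper bound, the same field of biomass moves a car at least twice as far via the power plant as via the distillery.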

Also, and importantly, a plug-in electric vehicle would decouple the energy source from the intended use of the energy. Electricity from biomass, sun, wind, coal, or nuclear - everything would feed into the same grid and end up being usable for transportation (or something else).

The authors estimate that the current grid is sufficient to charge up to 70 million vehicles overnight, whereas fueling that number of cars with 60 billion gallons of biofuels would require additional infrastructure and much more land dedicated to biofuels. Using biomass-to-electricity to power the same number of cars would require only a little additional farmland.

Now, burning things has gotten a bad reputation lately, so remember that the only CO2 released would be what was captured by the plants when they grew. If you were to sequester the CO2 produced during burning, you would in effect remove CO2 from the atmosphere.

So, considering current technology, where would you put your money?

Discomfortingly, the current focus at DOE and USDA seems to be on biofuels. In 2008 an estimated $9 billion went to investments and tax advantages in that area, a figure which could go up to $30 billion under current legislation (these numbers are from the editorial).