Let's Keep It PG

Apr. 25th, 2019 10:40 pm
[syndicated profile] questionablecontent_feed

this might be the first ever instance of Netflix closed captioning as a humor device in a comic

[syndicated profile] in_the_pipeline_feed

Posted by Derek Lowe

This paper comes under the heading of “early days, but possibly of great interest”. It demonstrates room-temperature synthesis of ammonia from nitrogen gas using a samarium/molybdenum system, and chemists of all sorts will sit up at that news and say “Hold it. Ammonia is the Haber-Bosch process, isn’t it?”

That it is. And that’s a reaction that keeps over half the human race alive, through its use in fertilizer production. This is a prime example of one of the absolute foundation stones of modern existence that most people are completely unaware of. All over the world, there are huge reactors running at high pressures and temperatures, full of carefully-worked-out iron-based catalysts, that are pulling nitrogen and hydrogen in and pushing ammonia out at a rate (worldwide) of nearly 300 tons per minute. Until Haber worked out that reaction (and Bosch improved it for industrial scale), the only creatures on Earth that could do that were certain types of bacteria. In 1909, we became the second. Note that those bacteria are also (and still) keeping large swathes of the human race alive, since they’re the nitrogen-fixers found around the roots of legumes. And if either of these things stopped working, the bacteria or our industrial ammonia plants, we’d be looking at mass famines within months with effects on the human population not seen since the Black Death. Both going down at once would have a decent shot at collapsing our civilization.
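
A quick back-of-the-envelope check on that figure, for anyone who wants to see it laid out (the 300 tons/minute rate is the one quoted above; the rest is just unit conversion):

```python
# Sanity check on the worldwide ammonia production rate quoted above
# (~300 metric tons per minute; the annual figure is unit conversion).
MINUTES_PER_YEAR = 60 * 24 * 365      # 525,600 minutes
tons_per_minute = 300                 # metric tons of NH3, worldwide

annual_megatons = tons_per_minute * MINUTES_PER_YEAR / 1e6
print(f"~{annual_megatons:.0f} million tons of ammonia per year")
# -> ~158 million tons/year, in line with reported global output
```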

The Haber process has a reputation for being pretty severe, what with those temperatures (>400 C) and pressures (around 400 atmospheres), but as this commentary on the new paper makes clear, it’s really a pretty good setup. Thermodynamically, it makes very efficient use of its energy input and wastes very little on side processes. The problem is the size of those inputs: running a Haber-Bosch plant is an energy-intensive undertaking, and it’s even more so when you consider the energy that has to go into making the hydrogen starting material as well. A new catalytic route that lowered the energy barriers involved would be very desirable from a power-consumption and carbon-emissions standpoint.

This new reaction doesn’t use hydrogen gas at all – rather, the H atoms in the ammonia product come from water, with the samarium iodide allowing it to serve as such a source. Under normal conditions that’s not such an easy reaction – about 111 kcal/mole for O-H dissociation in bulk water – but in hydrated samarium iodide the O-H bond strength is knocked down to 26 kcal/mole. Other OH-containing molecules (such as ethylene glycol) look like they can serve as well. So the samarium end of the reaction furnishes the hydrogen, and the molybdenum end activates the nitrogen (indeed, it’s an important element in the bacterial nitrogenase enzymes as well). The reaction turns over at greater than 100 times per minute, which may not sound too impressive compared to an enzymatic process, but that actually approaches the ammonia output of nitrogenase itself – this is obviously not an easy reaction under any conditions.
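
For a feel of what an 85 kcal/mole drop means, here’s a deliberately crude sketch. Treating the bond-strength difference as a straight cut in an activation barrier is a rough Bell–Evans–Polanyi-style assumption on my part, not a claim from the paper, but it makes the qualitative point:

```python
import math

# Crude illustration only: treat the 111 -> 26 kcal/mole drop in O-H
# bond strength as if it were a straight cut in an activation barrier
# (a Bell-Evans-Polanyi-style assumption, not the paper's claim), and
# compare Boltzmann factors at room temperature.
R = 1.987e-3                 # gas constant, kcal/(mol*K)
T = 298.0                    # room temperature, K
bde_bulk_water = 111.0       # O-H dissociation in bulk water, kcal/mole
bde_hydrated_smi2 = 26.0     # O-H in hydrated samarium iodide, kcal/mole

ratio = math.exp((bde_bulk_water - bde_hydrated_smi2) / (R * T))
print(f"Boltzmann factor ratio: ~1e{math.log10(ratio):.0f}")
# -> ~1e62. The number itself is meaningless; the point is that H atoms
# effectively unavailable from bulk water become chemically accessible.
```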

What it doesn’t approach is the throughput of the Haber-Bosch, of course, and as it stands the reaction isn’t really suitable for industrial scaleup. But it’s of potential importance, as mentioned, because of the non-hydrogen-gas aspect. This is a new way to get ammonia to form, and it already works at least 10x better than any other artificial system that’s not a Haber-Bosch variant, and this at atmospheric pressure and room temperature. Thermodynamically, it still has a way to go when you add up the total energy involved versus ammonia production, but this looks very much like an area to pursue. The Haber plants are going to be with us for a while to come (they’d better), but there could be better ways. . .

 

The Electrons Continue to Beam In

Apr. 24th, 2019 01:27 pm
[syndicated profile] in_the_pipeline_feed

Posted by Derek Lowe

I had the chance yesterday to attend a one-day symposium on Cryo-EM (and MicroED) techniques here in Cambridge. The whole thing was co-hosted by ThermoFisher, who I gather are having a glorious time selling these instruments and want to extoll their virtues as much as possible, and by MIT. It helps that there are a lot of virtues to extoll. I’ve been writing about these techniques here on the blog as they’ve been developing over the years, and what we’ve been seeing is the rise of a completely new analytical technique – first slowly, then quickening, and now moving at a really impressive clip. Some of that can be seen at the statistics page of the EMDB. That cumulative count of maps released is on pace to keep up with the exponential growth shown so far: there have been 761 released so far this year (as I write), and that alone is more than were released in all of 2015. If there really are >2200 maps added this year, as it looks like there will be, the increase during 2019 by itself will be more than the total cumulative number that existed at the end of 2012. The increase in resolution of the deposited maps is impressive as well – the higher-resolution ones used to be a distinct minority, but the gap is closing rapidly.
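
That projection is nothing fancier than straight-line proportionality from the year-to-date count, as a quick sketch shows (the 761 figure and the date are the ones above):

```python
from datetime import date

# Straight-line extrapolation of 2019 EMDB map releases, assuming the
# year-to-date pace holds (761 is the count quoted above as of Apr 24).
as_of = date(2019, 4, 24)
released_so_far = 761

day_of_year = as_of.timetuple().tm_yday     # 114
projected = released_so_far * 365 / day_of_year
print(f"Projected 2019 releases: ~{projected:.0f}")
# -> ~2437, comfortably past the >2200 figure mentioned above
```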

Last fall I mentioned the applications to structure-based drug design, noting that at the time the review under discussion was written, there were only five (total) cryo-EM structures at or below 3 Angstrom resolution with small molecule ligands in them. But I just went over to the EMDB and counted four released so far this year (an example), with still more if you count back to the publication date of the review itself, and my impression of yesterday’s meeting was that there are a lot more coming over the horizon. The idea of looking for such ligands has gone from “Don’t bother” to “Let’s try it” and is moving smartly towards “Sure, why would you even wonder” territory. It’s not there yet, but as this equipment gets more capable and more widely distributed, and as people get more experience with it, that’s where we’re headed for sure.

Also last fall, I wrote about the related MicroED technique, which uses the electron microscope equipment to do diffraction instead of imaging. Its applications to things that crystallize are immediately apparent, because the crystals needed to produce a structure are ridiculously small. I had the chance yesterday to hear Tamir Gonen of UCLA (a leader in this field) speak on their recent results and it was quite impressive. Here’s a recent publication from the group to give you the idea. Gonen mentioned that some of the protein crystals that they’ve been working with are only about ten protein molecules on a side, and they’re pulling up to two-Angstrom resolution structures out of them. Try that with X-rays, if you want, but prepare for disappointment. Here’s another paper from the group from last year, showing a 0.75 Å structure (!) of a prion protofibril, and it has structural features that are invisible to X-ray crystallography in any form, including some electron density in the hydrogen-bonding network that is hard to explain at all with existing models. The small-molecule ED work described in that October blog post is continuing, too, an area that I’m following with great interest.

So where is the field going? One theme that kept coming up was the need to get all of these techniques out of the “artisanal handwork” mode that they’ve been in. Gonen is setting up a center at UCLA that’s trying to automate things as much as possible, and that same thought came up many times. Sample handling and tracking, evaluating particles on the grid (for cryo-EM), doing tiny crystallization experiments on the grid itself (for microED), making sure that you’re collecting useful data in general and not just shooting junk, automatically processing as much of the real data as possible: all of these need help. But my impression is that these problems, though not solved yet, are actively being hammered on by the best practitioners in the field, and there’s no reason that things can’t get a lot easier than they are now.

As Gonen himself put it, though, “I don’t want you going away from this saying ‘Oh, everything’s done, everything’s peachy, Tamir has it all figured out, we’re going to go get all our structures in an afternoon.’ We’re not there yet.” But what’s been accomplished so far is startling, and there’s a lot more to come.

I should mention, for people in the Boston/Cambridge area, that you’ll have another chance to hear about this stuff soon. The MassBio folks are planning an event on cryo-EM in drug discovery the morning of July 9 (8-10 AM) at their venue in Tech Square. Registration will be open on their web site on May 1, or you can email them for more details.

bathrooms over broadway

Apr. 24th, 2019 02:06 am
wohali: photograph of Joan (Default)
[personal profile] wohali
Ran across this delightful documentary completely by accident. It's about the little-known world of so-called "industrial musicals," that is, fully fledged Broadway-style musicals produced for company executives or salespeople only. Some soundtracks and video still survive, and they are amazing to listen to. The movie goes through their discovery, then extends to meeting the people who performed in them and wrote them.

Check it out - Bathtubs Over Broadway. You'll find you can't stop humming "My Bathroom" to yourself.

Here's the trailer: [embedded video]



Corporate hagiographies are fantastic, especially in musical form.

(no subject)

Apr. 23rd, 2019 07:41 pm
randomdreams: riding up mini slickrock (Default)
[personal profile] randomdreams
I went into work today to find out that my manager is taking three weeks off for family medical leave starting today. He didn't say anything about this to anyone yesterday.
I'm working on building a whole bunch of wrappers so software language A can call a big binary blob written in software language B. Last time I did this, it took me a week longer than I'd anticipated to get it working, for a large variety of reasons. (The API documentation was wrong, I didn't start asking for help or pointing out that I was going to have trouble meeting my deadline until I was right up against it, and the target hardware documentation was flat-out wrong about things like how to correctly power it.) So this time around I decided that I'd get it working before the "when are you going to have this done?" discussion even came up. I've figured out how to automate it, so I set that off and running, and when it finished I emailed my manager that I had a good start on the software for our next project, which, as of last Friday, was my number one priority that nothing should distract me from.
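
(The post doesn't say what languages A and B are, so purely as an illustration of the shape of this kind of wrapper work, here's a minimal sketch with every name invented: Python calling a hypothetical C shared library through ctypes.)

```python
import ctypes

# Illustration only -- the languages involved aren't named above, and
# every identifier here is invented. This is the general shape of such
# a wrapper: Python calling a C shared library ("binary blob") via ctypes.
lib = ctypes.CDLL("./libvendor_blob.so")    # hypothetical binary blob

# Declaring argument and return types is the step that goes wrong first
# when the API documentation is inaccurate.
lib.vendor_init.argtypes = [ctypes.c_uint32]
lib.vendor_init.restype = ctypes.c_int

def init_device(baud_rate: int) -> None:
    """Thin wrapper that turns C-style error codes into exceptions."""
    rc = lib.vendor_init(baud_rate)
    if rc != 0:
        raise RuntimeError(f"vendor_init failed with code {rc}")
```
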
He emailed me back almost immediately to ask me how an entirely different project, the one that was my highest priority last Wednesday, was going, and mentioned at the end of the email that he'd decided to cancel the project for which I was writing this software.
I turned to my coworker and said "hey, did you know your next hardware project has been cancelled?"
He sat there gaping like a fish for several seconds. I was all "I guess you didn't know either."

From Industry to Academia

Apr. 23rd, 2019 12:51 pm
[syndicated profile] in_the_pipeline_feed

Posted by Derek Lowe

Academic research and industrial drug discovery have always been on separate paths, but my impression is that the two understand each other better now than they have at any time during my career. That’s in no small part due to the number of industrial scientists who have moved into academia (itself in no small part due to the employment turmoil in the industry). Here’s a perspective from James Barrow (ex-Merck) on what it’s like to make just that move, and it’s interesting reading. There are already guides for academics who are thinking of getting into industry, but not so many in the opposite direction.

If you’re going to continue to do work that bears on drug discovery (which is what many people in this situation will of course be doing), you’ll have to find a way to set up collaborations. A well-equipped industrial setting has its own permanent departments and specialists: you know, for example, that there will come a time when someone will need to investigate the best way to formulate a lead compound for dosing in an animal model, and by gosh, there are formulations people who specialize in just that, in the same way that there are people who specialize in keeping the animal facility itself running smoothly, and people whose expertise is the collection and analysis of the blood samples from those animal experiments, and so on. All of these are full-time jobs in a large industrial setting, and the people involved get quite good at what they do, but only the largest universities will have anything like that in place. The article recommends that newly transitioned industrial scientists make themselves as visible as possible, as quickly as possible, to attract collaborators.

Another thing to get used to is the overall speed of the work. One factor is that you’re also going to be training students, a wide variety of them, and a person has to internalize the fact that they’re no longer working in an environment where everyone involved on the project knows pretty much what they’re doing because they’ve done it before, etc.

The resources available for drug discovery in academic laboratories are more limited than found in pharma. This often limits the pace and throughput through the traditional drug discovery cycle, thereby necessitating careful selection of assays for maximum impact on the project. Laboratories can mitigate this disparity by use of core facilities, judicious outsourcing, and collaborations that often bring unique capabilities and data sets to a project. While collaborations can greatly enhance the resources available to a project, care must be taken to set expectations about how many compounds can be tested by a collaborator and what the turn-around time will be. Nothing is more frustrating for a medicinal chemist than completing the synthesis of a tough analog that will set the direction of future chemistry efforts, only to have to wait months for data.

Indeed. Moving to a nonindustrial environment will really bring into focus how many things you might have been taking for granted – for example, that it used to be someone else’s job to optimize an assay, etc. Barrow also addresses another factor that can show up:

Academic laboratories are generally rewarded for innovative research published in prestigious journals by additional grant funding and prominent lectures. This lessens the motivation to rigorously test key assays for reproducibility, especially critical in vivo experiments. In an academic drug discovery setting, these in vivo experiments are often done by collaborators who need extra reminding about rigorous protocols including randomization, blinding, and appropriate controls. . .Academic scientists who have spent years, sometimes decades, studying a particular disease process are often reluctant to do the “killer” experiment.

And while I’m sure that that’s not always true, I’ve also seen that exact effect in a couple of academic collaborations that I’ve been involved with. My guess is that working in industry, you get used to the way that there’s always another project coming along. The one you’re on now will work, or perhaps not work, and then you’ll move on to the next one, because that’s what you do. But for an academic lab, working on that single project (broadly defined) may in fact be “what they do”, and in extreme cases a go/no-go experiment might be seen as more of a threat than an exciting opportunity. (In a different way, I caught myself doing that in the last stages of my own PhD work – I was nervous about finishing up, and was finding ways to do other stuff and run other experiments rather than get down to the crucial things that would get me out).

I’m sure that there will be academic scientists who will take exception to Barrow’s comments above. It’s not like everyone is turning out sloppy work, of course, but it is true that the incentives are different. Keep in mind that when an interesting result or phenomenon is found in an industrial setting, it generally gets hammered on pretty quickly from several different angles to make sure that it reproduces, and under what conditions. No one will (or no one should!) continue to put effort into a project unless it can stand up to a good shaking of that sort. After all, a preclinical industrial drug effort is aimed at making a decision to spend a very large amount of money to give your new therapy to other human beings, in the hopes that it will perform well enough that large numbers of people will eventually be motivated to give you money for it. A large, competent, and skeptical group of outside observers (the FDA and other agencies, for starters, and then the broader medical community) will be inspecting your work. You will be filing legal documents to protect it, and if there are major mistakes or gaps in those, motivated competitors and their motivated lawyers will come after you. Even without such problems, those same competitors will be using your compounds in their own assays and projects, in well-funded attempts to do better than you did and persuade people to give them their money instead of giving it to you. If you missed something bad about your drug and it makes it to market anyway, so many lawsuits will come down around you that you’ll think it’s snowing. The whole atmosphere is rather more. . .Darwinian.

So as Barrow mentions, the attitudes and weltanschauungen (OK, he doesn’t use that word) that industrial scientists bring to academia can be rather orthogonal. But that can be a good thing, properly managed. Some people on the academic side might find it annoying, but others will find it invigorating (and that goes for the personal experience of the transitioning industrial researcher, too!). The article is a good intro to what to expect, and how to start arranging things so that you have the greatest chances for success. Remember, everyone in the drug industry came from academia in one form or another (grad students and post-docs at the very least). Having people move from industry back into the universities is a real opportunity that shouldn’t go to waste.

Big Pharma Cuts, Current and Coming

Apr. 22nd, 2019 12:38 pm
[syndicated profile] in_the_pipeline_feed

Posted by Derek Lowe

Word came out just before the weekend (first at Endpoints) that GlaxoSmithKline is laying off R&D employees at both Stevenage (UK) and Upper Providence (US). Current leadership is re-organizing drug discovery efforts to put more emphasis on oncology, immunology and genetic-linked disease, and this move seems linked to that. Reports are that overall R&D head count is supposed to increase as the strategy takes effect, but (as usual) it’s a lot easier to identify people and positions that don’t fit than it is to hire in people that you think do. John Carroll’s sources told him that chemistry is getting hit particularly hard in this round, actually. The number of re-thinks that GSK has been undergoing over the last few years seems to an outside observer to be impressive, but only in the glad-I’m-not-doing-that sense of the word.

And in other big-company news, you will have seen that it looks like the Bristol-Myers Squibb/Celgene deal is indeed going through after a shareholder vote. Management expects the deal to close in the third quarter of this year, which means, realistically, that productivity at both companies (and especially Celgene) will be taking a hit for all of 2019. Job cuts are surely coming at some point as a result of all this; it’s hard to see how it can be otherwise. But I see that the company’s CEO (Giovanni Caforio) is saying that they’ll be able to launch six new drugs in the next two years, now that they’re revving up to be such a powerhouse and all. But this is what CEOs always say after big mergers and acquisitions, and somehow the turbochargers don’t always seem to quite kick in according to plan. Well, he did throw a “potentially” into that sentence.

So for better or worse, that’s the company’s future. Looking back to its past, though, I’ve heard from ex-Connecticut sources that the main former R&D building in Wallingford is now the subject of a demolition permit. You don’t get much more end-of-an-era than that. But as someone whose first industrial research site is roughly the location of a Home Depot parking lot (and has been for some years now), I can tell you that old lab buildings themselves aren’t as important as the memories you’ve made in them and the things you’ve learned.

(no subject)

Apr. 21st, 2019 08:01 pm
randomdreams: riding up mini slickrock (Default)
[personal profile] randomdreams
I'm reupholstering the Spitfire seats. When I took one apart, amidst the half-wastebasket-quantity pile of broken-down foam, I also found a business card for a place that repaired MGs. The telephone number on the card was a seven-digit number, which probably means it was from the 1970s. There was also a 1974 penny in the seat.
Getting the seat cover off was a pain. There's a big piece of foam glued to the seat's steel tubing structure. It has a horizontal slit in the middle of it. There is a flap on the vinyl cover that goes through that slit and then clips to the frame further down, to hold the concave surface of the seat back against the foam. The clips are very easy to put on but extremely difficult to remove: they're like binder clips with tiny teeth that grip the fabric, only without the little wire loops you'd use to pry binder clips off. They simply press on, and there's no simple way to get them back off.
Similarly, the recliner function of the seat is controlled via a lever, and removing it to get the seat covering off was really difficult. Good thing I bought a gear puller years ago.

end of an era

Apr. 20th, 2019 04:16 pm
egypturnash: (HGA)
[personal profile] egypturnash

For most of the time I’ve lived in this apartment, the living room has been dominated by shelves with my fursona painted on them at life-size.

But now they are gone. Starkatt and her friend came by to haul them off to their new home. I am told there is a very high chance these shelves will be filled with the tools and products of a Dicksmithy. I think I am totally fine with my fursona being FILLED WITH DICKS.


Also I just liked looking at the clear spots in the dust that had collected on top of the shelves. A line of electric candles, the base of a glass swan I inherited from my mom, and a couple vague blobs where some dragon flags and a plushie lived.

It will feel weird to not round the corner and see my dragon self staring back at me. I may have to set up something similar in the new place, whether by painting it on the wall, or on new shelves… we will see. Between this and taking down the canvas print of the luminous white angel-dragon that I had on the inside of the front door, it definitely feels like I really don’t live here any more. The bedroom and kitchen and bathroom still look inhabited but that should change soon.

I was also pretty glad to not find anything lost in the space behind the shelves. There’s like two or three boxes worth of stuff hanging around the living room still; I would like to see myself make a dent in that before bed tonight, but getting the last things out of the path between the shelves and the door felt like significant work for the day…

Mirrored from Egypt Urnash.

mmcirvin: (Default)
[personal profile] mmcirvin
 So I gave in and got these packs for the XBox One. They're $20 each, which translates to about $6.50 in 1980 dollars. If you'd told me back in the early 80s that for six and a half bucks I could have a big box of 41 Atari VCS/2600 cartridges AND playable conversions of nine top-notch arcade cabinets (and another completely different collection for another six and a half bucks), I'd have fainted. Even if I knew perfectly well that most of the cartridges weren't really very good ones.
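
(That conversion is just a CPI ratio; a quick check, using approximate annual-average CPI values of my own rather than anything from the post:)

```python
# Quick check on the "about $6.50 in 1980 dollars" figure, using a CPI
# ratio with approximate annual averages (my numbers, not the post's).
cpi_1980 = 82.4
cpi_2019 = 255.7
price_2019 = 20.00

in_1980_dollars = price_2019 * cpi_1980 / cpi_2019
print(f"${in_1980_dollars:.2f}")    # -> about $6.45
```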

But time marches on. Are they worth $20 each? Eh... mayyybe, if you're nostalgic in the exact way I am. You can easily get playable versions of most or all of these on your computer for free by not strictly legal means, but it's nice having them all legitimately on the XBox. I admit that much of the attraction of Volume II for me was just having a console version of Major Havoc, a surprisingly complex and difficult early platformer/shooter using Atari's Quadrascan color vector hardware. (neb6's YouTube video of Major Havoc below got a comment insisting that somebody had to be lying about the game being from 1983; they couldn't possibly have done that with the primitive technology of the time. It reminded me of ancient-astronaut theorists.)

Some of these games are undone by the control scheme. Most of the paddle games like Pong and Breakout are just no good when played with a spring-centering XBox joystick. The Stella emulator actually does a lot better with these since it can use your computer's trackpad or mouse. Tempest... is actually playable, though it's definitely not as good as with the real rotary controller from the arcade machine. The arcade games made to be played with trackballs, like Missile Command and Atari Force: Liberator, work OK, but again they'd undoubtedly be better with the real thing.

On the other hand, the games with an Asteroids-style rotate-and-thrust scheme, like Asteroids, Asteroids Deluxe, Space Duel and Gravitar, fare surprisingly well. You can use the D-pad if you need fine control, and it actually feels better than the arcade buttons to me. Major Havoc comes across just fine. And Black Widow, another Atari color vector game that probably helped inspire Geometry Wars, plays great on this system: I think modern game controllers are better for twin-stick shooters than just about anything else. (Similarly, Robotron: 2084 is the absolute best conversion on the Midway Arcade Origins disc. Or maybe I just love twin-stick shooters.)

A nice bonus for the 2600 shovelware is that there are scans of all the original manuals included, and they've bothered to include on-screen captions of what the game variations mean. (The early cartridges particularly liked to advertise that they had dozens of different numbered game variations, which were usually the same game with every possible combination of three or four binary gameplay tweaks, in one- and two-player versions. It could make flipping through them a little confusing without reference materials.)
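
(The combinatorics are easy to sketch; the toggle names here are invented for illustration:)

```python
from itertools import product

# How a 2600 cartridge advertises "dozens" of numbered variations from
# a few binary switches (toggle names invented for illustration).
toggles = ["fast_ball", "small_paddle", "invisible_walls", "two_player"]
variations = list(product([False, True], repeat=len(toggles)))
print(len(variations))    # 16 -- one more switch and you're at 32
```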

These are basically all the 2600 games they could get that were (a) made by Atari, and (b) lacking in any encumbering external IP license. That means a lot of the best games are missing: no Space Invaders (that was a conversion of a Taito game), no Defender or Stargate (Williams), no Pole Position (Namco), no movie licenses, etc. But I have a weird morbid fascination born of ancient Sears Christmas Wish Books with the crude, blocky 4K games they put out in the late seventies (correction: I'd misremembered; these very earliest games are actually 2K ROMs!!), and they're all here, including the "Miniature Golf" cartridge in which everything is a rectangle, which I have found inexplicably addictive since I was a kid.

They also have the 2600 versions of Missile Command, Asteroids, Centipede and Millipede, but they're somewhat overshadowed by the arcade versions, which are also here. Except that the arcade Missile Command kicks my butt, and I got really good at the 2600 version several decades ago and can still play it for a good long time.

(no subject)

Apr. 18th, 2019 06:37 pm
randomdreams: riding up mini slickrock (Default)
[personal profile] randomdreams
I stayed home sick today because I felt awful, although I can't really say I was contagious or anything.
That gave me time to go to Boulder and get a key for [personal profile] threemeninaboat's bike locker at the train station (the first locker reserved on the line that opens next week), and then physically verify that the key works, while also verifying that the lock does not in fact work on the door latch mechanism they've provided.
I also spent some time installing lock hardware on doors, because several of the doors have never had the right keys in the whole time we've lived here and when someone locks one of those doors it can be a serious hassle to get back through it.
This led to the discovery that there is no door sweep on the door between the garage and the house. There's a 10mm-wide gap beneath it. That's about the same size as the gap between the garage door and the concrete.
No wonder we have a mouse problem.
I never thought to check, because the previous owners had carefully installed double-layer seals all the way around the door, as they did on every other one, to help prevent mice getting in.
It now has a nice custom-sawn piece of redwood that exactly fits in the space. Tomorrow, when I have the energy to take the door off its hinges, that'll get bolted in place on the bottom of the door, and I'll put a commercial adjustable sweep behind it to seal it against air as well as mice.
As I was working on getting the parts together to fix this, I noticed the garage door opener no longer closes correctly.
This house needs some burnin' down.
[syndicated profile] in_the_pipeline_feed

Posted by Derek Lowe

STAT is reporting that IBM has stopped trying to sell their “Watson for Drug Discovery” machine learning/AI tool, according to sources within the company. I have no reason to doubt that – in fact, I’ve sort of been expecting it. But no one seems to have told IBM’s website programming team, because the pages touting the product are still up (at least they are as I write this). They’re worth taking a look at in the light of reality. Gosh, it’s something:

Watson for Drug Discovery reveals connections and relationships among genes, drugs, diseases and other entities by analyzing multiple sets of life sciences knowledge. Researchers can generate new hypotheses using the resulting dynamic visualizations and evidence-backed predictions. . .

. . .Pharmaceutical companies, biotech and academic institutions use Watson for Drug Discovery to assist with new drug target identification and drug repurposing. Connect your in-house data with public data for a rich set of life sciences knowledge. Shorten the drug discovery process and increase the likelihood of your scientific breakthroughs.

Well, no, apparently they don’t use it much, because no one seems to have felt that they were increasing the likelihood of any scientific breakthroughs. The IBM pages are rather long on quotes from the Barrow Neurological Institute, about how they can make such breakthroughs “in a fraction of the time and cost”, but it looks like they’re going to have to get along without the product unless IBM is providing support to legacy customers. And since the STAT piece says that they’re halting both development and sale, that seems unlikely. Barrow and IBM press-released some results in late 2016, and there’s a promotional video from a month or two later, but that was both the first and last announcement from that collaboration.

What happened? Reality. As this IEEE Spectrum article from earlier this month shows in detail, IBM’s entire foray into health care has been marked by the familiar combination of overpromising and underdelivering. To their credit, the company made a very early push into the area (2011!) with a lot of people and a lot of money. Unfortunately, they also made sure that everyone knew that they were doing it, and what a big, big deal it all was.

The day after Watson thoroughly defeated two human champions in the game of Jeopardy!, IBM announced a new career path for its AI quiz-show winner: It would become an AI doctor. IBM would take the breakthrough technology it showed off on television—mainly, the ability to understand natural language—and apply it to medicine. Watson’s first commercial offerings for health care would be available in 18 to 24 months, the company promised.

In fact, the projects that IBM announced that first day did not yield commercial products. In the eight years since, IBM has trumpeted many more high-profile efforts to develop AI-powered medical technology—many of which have fizzled, and a few of which have failed spectacularly. 

Watson for Drug Discovery is just one of that suite of tools (well, potential tools). The idea was that it would go ripping through the medical literature, genomics databases, and your in-house data collection, finding correlations and clues that humans had missed. There’s nothing wrong with that as an aspirational goal. In fact, that’s what people eventually expect out of machine learning approaches, but a key word in that sentence is “eventually”. IBM, though, specifically sold the system as being ready to use for target identification, pathway elucidation, prediction of gene and protein function and regulation, drug repurposing, and so on. And it just wasn’t ready for those challenges – certainly not as early as the company was announcing. I first wrote about the company’s foray into drug discovery in 2013, and you’ll note that nothing really came out of the GSK/IBM work mentioned in that post. To the best of my knowledge, the two companies never really collaborated on drug discovery at all, but hey: they did team up on more targeted ways to advertise flu medicine.

Meanwhile, attempts at diagnostic and drug-therapy recommendations in oncology have been not only unproductive, but (according again to earlier reporting at STAT) even worse than that. The Spectrum article linked above goes into more details on those and other efforts all over the health care area that have come to naught, along with a few limited successes. And oddly enough, I’m going to finish off thinking about those. I still believe that machine learning is a perfectly good idea, with potential applications all over the field. But it ain’t magic. The areas where it’s worked the best so far are the ones with well-defined outcome sets based on large and very well-curated data collections, and where people have not been expecting the software to start spitting out golden insights and breakthrough proposals. It’ll get better – with a lot of work.

Just because people tried to sell the world on the idea that we’d moved past that stage years ago (A) does not make that so but (B) does not mean that we’ll never move past that stage at all. Next week I’ll have a post about machine learning and AI that goes into the real state of the field, from practitioners who have been spending their time whacking away at the code rather than generating promotional videos. IBM, though, has so far been doing the entire field a disservice with the way that they’ve spent too much time on the latter and not enough on the former.

Down To the Single Cells

Apr. 17th, 2019 01:48 pm
[syndicated profile] in_the_pipeline_feed

Posted by Derek Lowe

This is a good brief overview of a topic that’s becoming more important all the time: analysis on the single-cell level. And as the authors mention, it’s partly a case of wanting to do this, and partly a case of there being no other choice. Larger pooled tissue samples just don’t have the level of detail needed: you average out the numbers for the analytes you can detect, and follow that up with a much greater chance of missing the rare ones (or the values present in rare cell types, which get swamped out). And it’s just those variations that we so often need to focus on.

But single-cell work has pushed (and most certainly still does push) the limits of analytical chemistry. The math is not encouraging. If you have an analytical technique that can detect things down to one femtomole, that’s pretty good, right? But a typical mammalian cell is a one-picoliter container, which means that your one-femtomole sensitivity will limit you to millimolar concentrations inside that one cell – in other words, just the major stuff. Then there’s the question of spatial resolution and intracellular compartments. Lysing said cell and blending it up just gives you a smaller-scale version of the averaging problem you were trying to avoid by moving to single-cell analysis in the first place: what if a given species is much more important in the nucleus or mitochondrion than out in the cytosol? What do you do about membrane-bound components? And finally, there are the problems of finding and isolating particular cells of interest. If you’re looking (as you may well be) for rare cell types, like ones that have just flipped a switch to become cancerous, how do you find them? Many of the techniques used to identify cell types involve staining with small-molecule or immunologic reagents that aren’t compatible with the analysis you’re setting up for. Even with the more normal cell varieties, you have to be concerned with where they are in the cell cycle and their metabolic state, while being aware that you might be altering that state just through the isolation process. No, it ain’t easy.
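
That femtomole-versus-picoliter arithmetic is worth spelling out explicitly, since it sets the whole sensitivity bar:

```python
# The arithmetic behind the femtomole-vs-picoliter point above.
AVOGADRO = 6.022e23
detection_limit_mol = 1e-15     # one femtomole
cell_volume_L = 1e-12           # a typical ~1 pL mammalian cell

min_conc_molar = detection_limit_mol / cell_volume_L
print(f"Minimum detectable concentration: {min_conc_molar * 1e3:.0f} mM")
# -> 1 mM: only the most abundant intracellular species clear that bar.
print(f"And that femtomole is still ~{detection_limit_mol * AVOGADRO:.0e} molecules")
```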

Fortunately, it’s not impossible, either. The problems vary, though, with what you’re trying to measure. Single-cell genetic sequencing is probably the most robust measurement, thanks to advances in sequencing in general, and of course the amplification possible through PCR. Single-cell proteomics and small-molecule metabolomics, etc., are another thing entirely. Every technique available has sensitivity limits and blind spots even while working within those, and it’s important to keep both of those in mind. As the review details, a good number of these techniques are mass-spec-plus-something-else, as you might well imagine, since that’s the technique that combines the overall versatility, sensitivity, and throughput needed to make such experiments work at all. For physical separation of analytes, microscale capillary electrophoresis seems to be the leading technique.

Even now, though, there are some interesting results showing up. Bulk homogenate mass spec measurements, for one, tend to pick up a lot more glutamate and glutathione than you see in single-cell measurements. It could be that these components are sitting in the extracellular matrix and lost during single-cell preparation, or that they’re increased under the stress of that preparation, or that they’re broken out and released from other conjugated species more under some kinds of handling than others. (These aren’t mutually exclusive explanations, either).

One very difficult but very promising area combines imaging with chemical analysis, as in surface-mass-spec scanning techniques. Doing this at the resolution of cellular structures is really pushing up to the edge of what’s even possible at the moment, but things are continuing to improve, and we’re not bumping up against any laws of physics yet (or not quite). There are a lot of different ionization techniques out there, with many more to try, and already you can see real improvements in the range of ions kicked off the sample, the amount of depth (and depth information) you can get from the beams themselves, and the number of chemical species that can be detected and identified. (That identification is nontrivial, too – as these methods improve, you find many examples of interesting molecular ions being detected that no one is quite sure how to assign yet). This really is the absolute forefront of analytical instrumentation, with combinations of mass spec and advanced imaging and labeling techniques pushing the field forward. The high spatial resolution requirements, unusual ion beam characteristics, and added simultaneous instruments (such as Raman probes or atomic force microscope tips) mean that these setups are bespoke custom rigs at the moment, the wires-hanging-everywhere look that tells you that you’re seeing one-of-a-kind instrumentation.

I’m very happy to see it. The ability to track both endogenous species and outside ones (such as our drug molecules!) inside single cells is going to be crucial to the overall program of understanding what the heck is going on in a real living system. Honestly, I can’t see any other way to do it. These techniques are simultaneously a reductionist approach (down to individual chemical species) and an integrative one (since you’re observing the real living system in all its complexity), and that’s just the kind of blend that offers the most information. Figuring out how to obtain more such data and (even more) figuring out how to interpret all of it will occupy a lot of people for a long time, but to very good effect.
