
Health boards say around half of pharmacies have expressed interest in providing COVID-19 vaccines

Around half of Wales’ community pharmacies have expressed interest to health boards in providing COVID-19 vaccinations as part of the national programme.





NHS England lowers threshold for COVID-19 vaccination site applications

Community pharmacies able to administer up to 400 COVID-19 vaccines per week can now apply to become designated vaccination sites, NHS England has said.





Chiesi launches postal asthma inhaler recycling scheme

The UK’s first postal inhaler recycling scheme has been launched by pharmaceutical company Chiesi to support a more sustainable way of living for people with respiratory illnesses.





Half of asthma patients in the UK overusing SABAs, study finds

More than half of patients with asthma in the UK are “potentially overusing” short-acting β2-agonists, according to research.





RPS pays tribute to pharmacy law and ethics pioneer Joy Wingfield

The Royal Pharmaceutical Society has expressed its sadness at the death of Joy Wingfield, honorary professor of Pharmacy Law and Ethics at the University of Nottingham.





Lessons From A Private Funding Round: Science, Relationships, And Experience

By Mike Cloonan, CEO of Sionna Therapeutics, as part of the From The Trenches feature of LifeSciVC. An insightful piece on this blog following the JPM healthcare conference noted the “refreshing burst of enthusiasm” in the biotech sector. It’s true

The post Lessons From A Private Funding Round: Science, Relationships, And Experience appeared first on LifeSciVC.





Deconstructing the Diligence Process: An Approach to Vetting New Product Theses

By Aimee Raleigh, Principal at Atlas Venture, as part of the From The Trenches feature of LifeSciVC. Ever wondered what goes into diligencing a new idea, program, company, or platform? While each diligence is unique and every investor will have

The post Deconstructing the Diligence Process: An Approach to Vetting New Product Theses appeared first on LifeSciVC.





The Biotech Startup Contraction Continues… And That’s A Good Thing

Venture creation in biotech is witnessing a sustained contraction. After the pandemic bubble’s over-indulgence, the venture ecosystem appears to have reset its pace of launching new startups. According to the latest Pitchbook data, venture creation in biotech hit its slowest

The post The Biotech Startup Contraction Continues… And That’s A Good Thing appeared first on LifeSciVC.





Has Spring Sprouted New Growth in Immuno-Oncology?

By Jonathan Montagu, CEO of HotSpot Therapeutics, as part of the From The Trenches feature of LifeSciVC. As Boston’s weather has started its turn from the frigid darkness that is a northeast winter to the longer days and lighter conditions

The post Has Spring Sprouted New Growth in Immuno-Oncology? appeared first on LifeSciVC.





Boiling It Down: Conveying Complexity For Decision-makers

By Ankit Mahadevia, former CEO of Spero Therapeutics, as part of the From The Trenches feature of LifeSciVC. Drug development is complex. So is running a business. Sometimes, the work of doing both can make your head spin. In my

The post Boiling It Down: Conveying Complexity For Decision-makers appeared first on LifeSciVC.





Looking for Opportunities to Accelerate Clinical Research in Rare Diseases

By Mike Cloonan, Chief Executive Officer of Sionna Therapeutics, as part of the From The Trenches feature of LifeSciVC. The drug development process in rare diseases is rife with challenges, especially when companies target significant differentiation or first-in-class targets. Identifying

The post Looking for Opportunities to Accelerate Clinical Research in Rare Diseases appeared first on LifeSciVC.





Keeping It Simple: What Really Matters For Emerging Enterprises  

By Ankit Mahadevia, chairman of Spero Therapeutics, as part of the From The Trenches feature of LifeSciVC. A common theme in startup literature is that by cutting a range of unnecessary tasks, a step-change in results will follow. I’ve found

The post Keeping It Simple: What Really Matters For Emerging Enterprises   appeared first on LifeSciVC.





AllTrials guide to asking academic institutions about missing results

When university and hospital trusts were called to the UK parliament last year to answer questions on why they were not following the rules on reporting results, we saw how effective the questioning from politicians was. Those of you who watched the parliamentary session saw the pressure the university representatives were put under. Because the politicians asked […]





Half of US clinical trials are breaking the law on reporting results

New research has shown that the majority of clinical trials that should be following the US law on reporting results are not doing so. Less than half (41%) of clinical trial results were reported on time, and 1 in 3 trials (36%) remain unreported. The research also found that clinical trials sponsored by companies are the most likely […]





Hundreds of clinical trials ruled to be breaking the law

A judge in New York has ruled that hundreds of clinical trials registered on ClinicalTrials.gov are breaking the law by not reporting results. The ruling came in a court case launched against the US Department of Health and Human Services by two plaintiffs, a family doctor and a professor of journalism. The case focused on […]





Preview of Enrollment Analytics: Moving Beyond the Funnel (Shameless DIA Self-Promotion, Part 2)


Are we looking at our enrollment data in the right way?


I will be chairing a session on Tuesday on this topic, joined by a couple of great presenters (Diana Chung from Gilead and Gretchen Goller from PRA).

Here's a short preview of the session:



Hope to see you there. It should be a great discussion.

Session Details:

June 25, 1:45PM - 3:15PM

  • Session Number: 241
  • Room Number: 205B


1. Enrollment Analytics: Moving Beyond the Funnel
Paul Ivsin
VP, Consulting Director
CAHG Clinical Trials

2. Use of Analytics for Operational Planning
Diana Chung, MSc
Associate Director, Clinical Operations
Gilead

3. Using Enrollment Data to Communicate Effectively with Sites
Gretchen Goller, MA
Senior Director, Patient Access and Retention Services
PRA






Brazen Scofflaws? Are Pharma Companies Really Completely Ignoring FDAAA?

Results reporting requirements are pretty clear. Maybe critics should re-check their methods?

Ben Goldacre has rather famously described the clinical trial reporting requirements in the Food and Drug Administration Amendments Act of 2007 as a “fake fix” that was being thoroughly “ignored” by the pharmaceutical industry.

Pharma: breaking the law in broad daylight?
He makes this sweeping, unconditional proclamation about the industry and its regulators on the basis of a single study in the BMJ, blithely ignoring the facts that a) the authors of the study admitted they could not adequately determine the number of studies that were meeting FDAAA requirements, and b) a subsequent FDA review identified only 15 trials potentially out of compliance, out of a pool of thousands.


Despite the fact that the FDA, which has access to more data, says that only a tiny fraction of studies are potentially noncompliant, Goldacre's frequently repeated claims that the law is being ignored seem to have caught on in the general run of journalistic and academic discussions about FDAAA.

And now there appears to be additional support for the idea that a large percentage of studies are noncompliant with FDAAA results reporting requirements, in the form of a new study in the Journal of Clinical Oncology: "Public Availability of Results of Trials Assessing Cancer Drugs in the United States" by Thi-Anh-Hoa Nguyen et al. In it, the authors report even lower levels of FDAAA compliance – a mere 20% of randomized clinical trials met the requirement of posting results on clinicaltrials.gov within one year.

Unsurprisingly, the JCO results were immediately picked up and circulated uncritically by the usual suspects.

I have to admit not knowing much about pure academic and cooperative group trial operations, but I do know a lot about industry-run trials – simply put, I find the data as presented in the JCO study impossible to believe. Everyone I work with in pharma trials is painfully aware of the regulatory environment they work in. FDAAA compliance is a given, a no-brainer: large internal legal and compliance teams are everywhere, ensuring that the letter of the law is followed in clinical trial conduct. If anything, pharma sponsors are twitchily over-compliant with these kinds of regulations (for example, most still adhere to 100% verification of source documentation – sending monitors to physically examine every single record of every single enrolled patient - even after the FDA explicitly told them they didn't have to).

I realize that’s anecdotal evidence, but when such behavior is so pervasive, it’s difficult to buy into data that says it’s not happening at all. The idea that all pharmaceutical companies are ignoring a highly visible law that’s been on the books for 6 years is extraordinary. Are they really so brazenly breaking the rules? And is FDA abetting them by disseminating incorrect information?

Those are extraordinary claims, and would seem to require extraordinary evidence. The BMJ study had clear limitations that make its implications entirely unclear. Is the JCO article any better?

Some Issues


In fact, there appear to be at least two major issues that may have seriously compromised the JCO findings:

1. Studies that were certified as being eligible for delayed reporting requirements, but do not have their certification date listed.

The study authors make what I believe to be a completely unwarranted assumption:

In trials for approval of new drugs or approval for a new indication, a certification [permitting delayed results reporting] should be posted within 1 year and should be publicly available.

It’s unclear to me why the authors think the certifications “should be” publicly available. In re-reading FDAAA section 801, I don’t see any reference to that being a requirement. I suppose I could have missed it, but the authors provide a citation to a page that clearly does not list any such requirement.

But their methodology assumes that all trials that have a certification will have it posted:

If no results were posted at ClinicalTrials.gov, we determined whether the responsible party submitted a certification. In this case, we recorded the date of submission of the certification to ClinicalTrials.gov.

If a sponsor gets approval from FDA to delay reporting (as is routine for all drugs that are either not approved for any indication, or being studied for a new indication – i.e., the overwhelming majority of pharma drug trials), but doesn't post that approval on the registry, the JCO authors deem that trial “noncompliant”. This is not warranted: the company may have simply chosen not to post the certification despite being entirely FDAAA compliant.

2. Studies that were previously certified for delayed reporting and subsequently reported results

It is hard to tell how the authors treated this rather-substantial category of trials. If a trial was certified for delayed results reporting, but then subsequently published results, the certification date becomes difficult to find. Indeed, it appears in the case where there were results, the authors simply looked at the time from study completion to results posting. In effect, this would re-classify almost every single one of these trials from compliant to non-compliant. Consider this example trial:


  • Phase 3 trial completes January 2010
  • Certification of delayed results obtained December 2010 (compliant)
  • FDA approval June 2013
  • Results posted July 2013 (compliant)


In looking at the JCO paper's methods section, it really appears that this trial would be classified as reporting results 3.5 years after completion, and therefore be considered noncompliant with FDAAA. In fact, this trial is entirely kosher, and would be extremely typical for many phase 2 and 3 trials in industry.
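To make the concern concrete, here is a minimal sketch (my own illustration, not the authors' code) of how the two readings of the rule diverge on the example trial above. The dates come from the bullets; the exact days within each month are assumed, and the certification-aware rule is deliberately simplified to the single question of whether a certification was filed within a year of completion.

from datetime import date

# Example trial from the bullet list above (days within each month are assumed)
completion    = date(2010, 1, 15)   # Phase 3 trial completes January 2010
certification = date(2010, 12, 15)  # certification of delayed results, December 2010
results       = date(2013, 7, 15)   # results posted July 2013

ONE_YEAR_DAYS = 365

def naive_compliant(completion, results):
    # My reading of the JCO method: results must appear within a year of completion
    return (results - completion).days <= ONE_YEAR_DAYS

def certification_aware_compliant(completion, certification, results):
    # Simplified alternative: a certification filed within a year of completion
    # defers the results deadline, so later posting does not count against the trial
    if certification is not None and (certification - completion).days <= ONE_YEAR_DAYS:
        return True
    return naive_compliant(completion, results)

print(naive_compliant(completion, results))                               # False -> counted as noncompliant
print(certification_aware_compliant(completion, certification, results))  # True  -> compliant

The same trial flips from "noncompliant" to compliant depending solely on whether the certification is taken into account.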

Time for Some Data Transparency


The above two concerns may, in fact, be non-issues. They certainly appear to be implied in the JCO paper, but the wording isn't terribly detailed and could easily be giving me the wrong impression.

However, if either or both of these issues are real, they may affect the vast majority of "noncompliant" trials in this study. Given the fact that most clinical trials are either looking at new drugs, or looking at new indications for new drugs, these two issues may entirely explain the gap between the JCO study and the unequivocal FDA statements that contradict it.

I hope that, given the importance of transparency in research, the authors will be willing to post their data set publicly so that others can review their assumptions and independently verify their conclusions. It would be more than a bit ironic otherwise.

[Image credit: Shameless lawlessness via Flickr user willytronics.]


Thi-Anh-Hoa Nguyen, Agnes Dechartres, Soraya Belgherbi, and Philippe Ravaud (2013). Public Availability of Results of Trials Assessing Cancer Drugs in the United States. Journal of Clinical Oncology. DOI: 10.1200/JCO.2012.46.9577





Brave New Biopharm Blogging

Although a few articles on this site are older, I really only began blogging in earnest about 15 months ago. However, I suppose that's long enough that I can count myself as at least somewhat established, and take a moment to welcome and encourage some interesting newcomers to the scene.
 
Bloggers in dank basements, their natural habitat.
There are 3 relative newcomers that I've found really interesting, all with very different perspectives on drug development and clinical research:


The Big Pharma insider.
With the exception of John LaMattina (the former Pfizer exec who regularly provides seriously thought provoking ideas over on Forbes), I don’t know of anyone from the ranks of Big Pharma who writes both consistently and well. Which is a shame, given how many major past, current, and future therapies pass through those halls.

Enter Frank David, the Director of Strategy at AstraZeneca's Oncology Innovative Medicines unit. Frank started his Pharmagellan blog this April, and has been putting out a couple thoughtful perspective pieces a month since then.

Frank also gets my vote for most under-followed Twitter account in the industry, as he’s putting out a steady stream of interesting material.


Getting trials done.
Clinical operations – the actual execution of the clinical trials we all talk about – is seriously underrepresented in the blogosphere. There are a number of industry blogs, but none that aren’t trying first and foremost to sell you something.

I met Nadia Bracken on my last trip out to the San Francisco bay area. To say Nadia is driven is to make a rather silly understatement. Nadia is driven. She thinks fast and she talks fast. ClinOps Toolkit is a blog (or resource? or community?) that is still very much in development, but I think it holds a tremendous amount of potential. People working in ClinOps should be embracing her, and those of us who depend on operations teams getting the job done should keep a close eye on the website.


Watching the money.
I am not a stock trader. I am a data person, and data says trust big sample sizes. And, honestly, I just don't have the time.

But that doesn't stop me from realizing that a lot of great insight about drug development – especially when it concerns small biotechs – is coming from the investment community. So I tend to follow a number of financial writers, as I've found that they do a much better job of digging through the hype than can ever be expected of the mainstream media.

One stock writer who I've been following for a while is Andrew Goodwin, who maintains the Biotech Due Diligence website and blog. Andrew clearly has a great grasp on a number of topics, so when he described a new blog as a “must-have addition” to one's reading list, I had to take a look.

And the brand-new-this-month blog, by David Sable at Special Situations Fund, does seem like a great read. David looks both at the corporate dynamics and scientific stories of biotechs with a firmly skeptical view. I know most blogs this new will not be around 6 months from now (and David admits as much in his opening post), but I’m hoping this one lasts.

. . . . .

So, I encourage you to take a look at the above 3 blogs. I'm happy to see more and diverse perspectives on the drug development process starting to emerge, and hope that all 3 of these authors stick around for quite a while – we need their ideas.



[Bloggerhole photo courtesy of Flickr user second_mouse.]





Patient Recruitment: Taking the Low Road

The Wall Street Journal has an interesting article on the use of “Big Data” to identify and solicit potential clinical trial participants. The premise is that large consumer data aggregators like Experian can target patients with certain diseases through correlations with non-health behavior. Examples given include “a preference for jazz” being associated with arthritis and “shopping online for clothes” being an indicator of obesity.

We've seen this story before.

In this way, allegedly, clinical trial patient recruitment companies can more narrowly target their solicitations* for patients to enroll in clinical trials.

In the spirit of full disclosure, I should mention that I was interviewed by the reporter of this article, although I am not quoted. My comments generally ran along three lines, none of which really fit in with the main storyline of the article:

  1. I am highly skeptical that these analyses are actually effective at locating patients
  2. These methods aren't really new – they’re the same tactics that direct marketers have been using for years
  3. Most importantly, the clinical trials community can – and should – be moving towards open and collaborative patient engagement. Relying on tactics like consumer data snooping and telemarketing is an enormous step backwards.

The first point is this: certainly some diseases have correlates in the real world, but these correlates tend to be pretty weak, and are therefore unreliable predictors of disease. Maybe it’s true that those struggling with obesity tend to buy more clothes online (I don’t know if it’s true or not – honestly it sounds a bit more like an association built on easy stereotypes than on hard data). But many obese people will not shop online (they will want to be sure the clothes actually fit), and vast numbers of people with low or average BMIs will shop for clothes online.  So the consumer data will tend to have very low predictive value. The claims that liking jazz and owning cats are predictive of having arthritis are even more tenuous. These correlates are going to be several times weaker than basic demographic information like age and gender. And for more complex conditions, these associations fall apart.
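For a rough sense of why weak correlates make poor predictors, here is a back-of-the-envelope Bayes calculation; the prevalence and behavior rates below are invented for illustration, not figures from the article or from any data vendor.

# Hypothetical numbers: a condition affecting 30% of the target population,
# and a behavior slightly more common among affected people (50%) than
# among everyone else (40%)
prevalence           = 0.30
p_behavior_given_yes = 0.50
p_behavior_given_no  = 0.40

p_behavior = (p_behavior_given_yes * prevalence
              + p_behavior_given_no * (1 - prevalence))

# Probability of actually having the condition, given the behavior was observed
ppv = p_behavior_given_yes * prevalence / p_behavior
print(round(ppv, 2))  # 0.35 -- barely better than the 0.30 base rate

In other words, even a correlate with a real (assumed) association moves the needle only a few percentage points beyond simply guessing from the base rate.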

Marketers claim to solve this by factoring a complex web of associations through a magical black box – the WSJ article mentions that they “applied a computed algorithm” to flag patients. Having seen behind the curtain on a few of these magic algorithms, I can confidently say that they are underwhelming in their sophistication. Hand-wavy references to Big Data and Algorithms are just the tools used to impress pharma clients. (The down side to that, of course, is that you can’t help but come across as big brotherish – see this coverage from Forbes for a taste of what happens when people accept these claims uncritically.)

But the effectiveness of these data slice-n-dicing activities is perhaps beside the point. They are really just a thin cover for old-fashioned boiler room tactics: direct mail and telemarketing. When I got my first introduction to direct marketing in the 90’s, it was the exact same program – get lead lists from big companies like Experian, then aggressively mail and call until you get a response.

The limited effectiveness and old-school aggressiveness of these programs are nicely illustrated in the article by one person’s experience:
Larna Godsey, of Wichita, Kan., says she received a dozen phone calls about a diabetes drug study over the past year from a company that didn't identify itself. Ms. Godsey, 63, doesn't suffer from the disease, but she has researched it on the Internet and donated to diabetes-related causes. "I don't know if it's just a coincidence or if they're somehow getting my information," says Ms. Godsey, who filed a complaint with the FTC this year.
The article notes that one recruitment company, Acurian, has been the subject of over 500 FTC complaints regarding its tactics. It’s clear that Big Data is just the latest buzzword lipstick on the telemarketing pig. And that’s the real shame of it.

We have arrived at an unprecedented opportunity for patients, researchers, and private industry to come together and discuss, as equals, research priorities and goals. Online patient communities like Inspire and PatientsLikeMe have created new mechanisms to share clinical trial opportunities and even create new studies. Dedicated disease advocates have jumped right into the world of clinical research, with groups like the Cystic Fibrosis Foundation and Michael J. Fox Foundation no longer content with raising research funds, but actively leading the design and operations of new studies.

Some – not yet enough – pharmaceutical companies have embraced the opportunity to work more openly and honestly with patient groups. The scandal of stories like this is not the Wizard of Oz histrionics of secret computer algorithms, but that we as an industry continue to take the low road and resort to questionable boiler room tactics.

It’s past time for the entire patient recruitment industry to drop the sleaze and move into the 21st century. I would hope that patient groups and researchers will come together as well to vigorously oppose these kinds of tactics when they encounter them.

(*According to the article, Acurian "has said that calls related to medical studies aren't advertisements as defined by law," so we can agree to call them "solicitations".)





The Coming of the MOOCT?

Big online studies, in search of millions of participants.

Back in September, I enrolled in the Health eHeart Study - an entirely online research study tracking cardiac health. (Think Framingham Heart, cast wider and shallower - less intensive follow-up, but spread out to the entire country.)


[In the spirit of full disclosure, I should note that I haven’t completed any follow-up activities on the Health eHeart website yet. Yes, I am officially part of the research adherence problem…]


Yesterday, I learned of the Quantified Diet Project, an entirely online/mobile app-supported randomized trial of 10 different weight loss regimens. The intervention is short - only 4 weeks - but that’s probably substantially longer than most New Year diets manage to last, and should be just long enough to detect some early differences among the approaches.


I have been excited about the potential for online medical research for quite some time. For me, the real beginning was when PatientsLikeMe published the results of their online lithium for ALS research study - as I wrote at the time, I have never been so enthused about a negative trial before or since.



That was two and a half years ago, and there hasn't been a ton of activity since then outside of PatientsLikeMe (who have expanded and formalized their activities in the Open Research Exchange). So I’m eager to hear how these two new studies go. There are some interesting similarities and differences:


  • Both are university/private collaborations, and both (perhaps unsurprisingly) are rooted in California: Health eHeart is jointly run by UCSF and the American Heart Association, while Quantified Diet is run by app developer Lift with scientific support from an (unidentified?) team at Berkeley.
  • Both are pushing for a million or more participants, dwarfing even very large traditional studies by orders of magnitude.
  • Health eHeart is entirely observational, and researchers will have the ability to request its data to test their own hypotheses, whereas Quantified Diet is a controlled, randomized trial.


Data entry screen on Health eHeart
I really like the user interface for Health eHeart - it’s extremely simple, with a logical flow to the sections. It clearly appears to be designed for older participants, and the extensive data intake is subdivided into a large number of subsections, each of which can typically be completed in 2-4 minutes.



I have not enrolled into the Quantified Diet, but it appears to have a strong social media presence. You can follow the Twitter conversation through the #quantdiet hashtag. The semantic web and linked data guru Kerstin Forsberg has already posted about joining, and I hope to hear more from her and from clinical trial social media expert Rahlyn Gossen, who’s also joined.


To me, probably the most intriguing technical feature of the QuantDiet study is its “voluntary randomization” design. Participants can self-select into the diet of their choice, or can choose to be randomly assigned by the application. It will be interesting to see whether any differences emerge between the participants who chose a particular arm and those who were randomized into that arm - how much does a person’s preference matter?


In an earlier tweet I asked, “is this a MOOCT?” - short for Massive Open Online Clinical Trial. I don’t know if that’s the best name for it, and I’d love to hear other suggestions. By any other name, however, these are still great initiatives and I look forward to seeing them thrive in the coming years.

The implications for pharmaceutical and medical device companies are still unclear. Pfizer's jump into the world of "virtual trials" was a major bust, and widely second-guessed. I believe there is definitely a role and a path forward here, and these big efforts may teach us a lot about how patients want to be engaged online.





Megafund versus Megalosaurus: Funding Drug Development


This new 10-minute TEDMED talk is getting quite a bit of attention:


 (if embedded video does not work, try the TED site itself.)

In it, Roger Stein claims to have created an approach to advancing drugs through clinical trials that will "fundamentally change the way research for cancer and lots of other things gets done".

Because the costs of bringing a drug to market are so high, time from discovery to marketing is so long, and the chances of success of any individual drug are so grim, betting on any individual drug is foolish, according to Stein. Instead, risks for a large number of potential assets should be pooled, with the eventual winners paying for the losers.

To do this, Stein proposes what he calls a "megafund" - a large collection of assets (candidate therapies). Through some modeling and simulations, Stein suggests some of the qualities of an ideal megafund: it would need in the neighborhood of $3-15 billion to acquire and manage 80-150 drugs. A fund of this size and with these assets would be able to provide an equity yield of about 12%, which would be "right in the investment sweet spot of pension funds and 401(k) plans".
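As a toy illustration of the pooling logic behind this proposal (my own sketch, with invented success probabilities and payoffs rather than Stein's actual model), a quick Monte Carlo shows how spreading the same bet over many candidates tames the downside even though the expected return per asset is unchanged:

import random

random.seed(0)

# All numbers are invented for illustration
COST_PER_ASSET = 0.1   # $0.1B to acquire and develop each candidate
P_SUCCESS      = 0.08  # assumed probability any one candidate pays off
PAYOFF         = 2.0   # $2B payoff for each success

def fund_return(n_assets):
    wins = sum(random.random() < P_SUCCESS for _ in range(n_assets))
    invested = n_assets * COST_PER_ASSET
    return (wins * PAYOFF - invested) / invested

single = [fund_return(1)   for _ in range(10000)]
pooled = [fund_return(100) for _ in range(10000)]

print(sum(r < 0 for r in single) / len(single))  # a single bet loses money ~92% of the time
print(sum(r < 0 for r in pooled) / len(pooled))  # a 100-asset pool loses money far less often

The expected return is identical in both cases; what the pool buys is a narrower distribution of outcomes, which is the property bond-like investors care about.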

Here's what I find striking about those numbers: let's compare Stein's Megafund to everyone's favorite Megalosaurus, the old-fashioned Big Pharma dinosaur sometimes known as Pfizer:


Megafund (Stein) vs. Megalosaurus (Pfizer):

  • Funding: $3-15 billion (Megafund) vs. $9 billion estimated 2013 R&D spend (Pfizer)
  • Assets: 80-150 vs. 81 in the pipeline (plus many more in preclinical)
  • Return on equity: 12% (estimated) vs. 9.2% (last 10 years) to 13.2% (last 5)

Since Pfizer's a dinosaur, it can't possibly compete with the sleek, modern Megafund, right? Right?

These numbers look remarkably similar. Pfizer - and a number of its peers - are spending a Megafund-sized budget each year to shepherd through a Megafund-sized number of compounds. (Note that many of Pfizer's peers have substantially fewer drugs in their published pipelines, but they own many times more compounds - the pipeline is just the drugs they've elected to file an IND on.)

What am I missing here? I understand that a fund is not a company, and there may be some benefits to decoupling asset management decisions from actual operations, but this won't be a tremendous gain, and would presumably be at least partially offset by increased transaction costs (Megafund has to source, contract, manage, and audit vendors to design and run all its trials, after all, and I don't know why I'd think it could do that any more cheaply than Big Pharma can). And having a giant drug pipeline's go/no go decisions made by "financial engineers" rather than pharma industry folks would seem like a scenario that's only really seen as an upgrade by the financial engineers themselves.

A tweet from V.S. Schulz pointed me to a post on Derek Lowe's In the Pipeline blog, which led to a link to this paper by Stein and two others in Nature Biotechnology from a year and a half ago. The authors spend most of their time differentiating themselves from other structures in the technical, financial details rather than explaining why a megafund would work better at finding new drugs. However, they definitely think this is qualitatively different from existing pharma companies, and offer a couple of reasons. First,
[D]ebt financing can be structured to be more “patient” than private or public equity by specifying longer maturities; 10- to 20-year maturities are not atypical for corporate bonds. ... Such long horizons contrast sharply with the considerably shorter horizons of venture capitalists, and the even shorter quarterly earnings cycle and intra-daily price fluctuations faced by public companies.
I'm not sure where this line of thought is coming from. Certainly all big pharma companies' plans extend decades into the future - there may be quarterly earnings reports to file, but that's a force exerted far more on sales and marketing teams than on drug development. The financing of pharmaceutical development is already extremely long term.

Even in the venture-backed world, Stein and team are wrong if they believe there is pervasive pressure to magically deliver drugs in record time. Investors and biotech management are both keenly aware of the tradeoffs between speed and regulatory success. Even this week's came-from-nowhere Cinderella story, Intercept Pharmaceuticals, was founded with venture money over a decade ago - these "longer maturities" are standard issue in biotech. We aren't making iPhone apps here, guys.

Second,
Although big pharma companies are central to the later stages of drug development and the marketing and distributing of approved drugs, they do not currently play as active a role at the riskier preclinical and early stages of development
Again, I'm unsure why this is supposed to be so. Of Pfizer's 81 pipeline compounds, 55 are in Phase 1 or 2 - a ratio that's pretty heavy on early, risky projects, and that's not too different from industry as a whole. Pfizer does not publish data on the number of compounds it currently has undergoing preclinical testing, but there's no clear reason I can think of to assume it's a small number.

So, is Megafund truly a revolutionary idea, or is it basically a mathematical deck-chair-rearrangement for the "efficiencies of scale" behemoths we've already got?

[Image: the world's first known dino, Megalosaurus, via Wikipedia.]





Waiver of Informed Consent - proposed changes in the 21st Century Cures Act

Adam Feuerstein points out - and expresses considerable alarm over - an overlooked clause in the 21st Century Cures Act:


In another tweet, he suggests that the act will "decimate" informed consent in drug trials. Subsequent responses and retweets  did nothing to clarify the situation, and if anything tended to spread, rather than address, Feuerstein's confusion.

Below is a quick recap of the current regulatory context and a real-life example of where the new wording may be helpful. In short, though, I think it's safe to say:


  1. Waiving informed consent is not new; it's already permitted under current regs
  2. The standards for obtaining a waiver of consent are stringent
  3. They may, in fact, be too stringent in a small number of situations
  4. The act may, in fact, be helpful in those situations
  5. Feuerstein may, in fact, need to chill out a little bit


(For the purposes of this discussion, I’m talking about drug trials, but I believe the device trial situation is parallel.)

Section 505(i) - the section this act proposes to amend - instructs the Secretary of Health and Human Services to promulgate rules regarding clinical research. Subsection 4 addresses informed consent:

…the manufacturer, or the sponsor of the investigation, require[e] that experts using such drugs for investigational purposes certify to such manufacturer or sponsor that they will inform any human beings to whom such drugs, or any controls used in connection therewith, are being administered, or their representatives, that such drugs are being used for investigational purposes and will obtain the consent of such human beings or their representatives, except where it is not feasible or it is contrary to the best interests of such human beings.

[emphasis  mine]

Note that this section already recognizes situations where informed consent may be waived for practical or ethical reasons.

These rules were in fact promulgated under 45 CFR part 46, section 116. The relevant bit – as far as this conversation goes – regards circumstances under which informed consent might be fully or partially waived. Specifically, there are 4 criteria, all of which need to be met:

 (1) The research involves no more than minimal risk to the subjects;
 (2) The waiver or alteration will not adversely affect the rights and welfare of the subjects;
 (3) The research could not practicably be carried out without the waiver or alteration; and
 (4) Whenever appropriate, the subjects will be provided with additional pertinent information after participation.

In practice, this is an especially difficult set of criteria to meet for most studies. Criterion (1) rules out most “conventional” clinical trials, because the hallmarks of those trials (use of an investigational medicine, randomization of treatment, blinding of treatment allocation) are all deemed to be more than “minimal risk”. That leaves observational studies – but even many of these cannot clear the bar of criterion (3).

That word “practicably” is a doozy.

Here’s an all-too-real example from recent personal experience. A drug manufacturer wants to understand physicians’ rationales for performing a certain procedure. It seems – but there is little hard data – that a lot of physicians do not strictly follow guidelines on when to perform the procedure. So we devise a study: whenever the procedure is performed, we ask the physician to complete a quick form categorizing why they made their decision. We also ask him or her to transcribe a few pieces of data from the patient chart.

Even though the patients aren’t personally identifiable, the collection of medical data qualifies this as a clinical trial.

It’s a minimal risk trial, definitely: the trial doesn’t dictate at all what the doctor should do, it just asks him or her to record what they did and why, and supply a bit of medical context for the decision. All told, we estimated 15 minutes of physician time to complete the form.

The IRB monitoring the trial, however, denied our request for a waiver of informed consent, since it was “practicable” (not easy, but possible) to obtain informed consent from the patient.  Informed consent – even with a slimmed-down form – was going to take a minimum of 30 minutes, so the length of the physician’s involvement tripled. In addition, many physicians opted out of the trial because they felt that the informed consent process added unnecessary anxiety and alarm for their patients, and provided no corresponding benefit.

The end result was not surprising: the budget for the trial more than doubled, and enrollment was far below expectations.

Which leads to two questions:

  1. Did the informed consent appreciably help a single patient in the trial? Very arguably, no. Consenting to being “in” the trial made zero difference in the patients’ care, added time to their stay in the clinic, and possibly added to their anxiety.
  2. Was less knowledge collected as a result? Absolutely, yes. The sponsor could have run two studies for the same cost. Instead, they ultimately reduced the power of the trial in order to cut losses.


Bottom line, it appears that the modifications proposed in the 21st Century Cures Act really only target trials like the one in the example. The language clearly retains criteria 1 and 2 of the current HHS regs, which are the most important from a patient safety perspective, but cuts down the “practicability” requirement, potentially permitting high-quality studies to be run with less time and cost.

Ultimately, it looks like a very small, but positive, change to the current rules.

The rest of the act appears to be a mash-up of some very good and some very bad (or at least not fully thought out) ideas. However, this clause should not be cause for alarm.





Establishing efficacy - without humans?

The decade following passage of FDAAA has been one of easing standards for drug approvals in the US, most notably with the advent of “breakthrough” designation created by FDASIA in 2012 and the 21st Century Cures Act in 2016.

Although, as of this writing, there is no nominee for FDA Commissioner, it appears to be safe to say that the current administration intends to accelerate the pace of deregulation, mostly through further lowering of approval requirements. In fact, some of the leading contenders for the position are on record as supporting a return to pre-Kefauver-Harris days, when drug efficacy was not even considered for approval.

Build a better mouse model, and pharma will
beat a path to your door - no laws needed.

In this context, it is at least refreshing to read a proposal to increase efficacy standards. This comes from two bioethicists at McGill University, who make the somewhat-startling case for a higher degree of efficacy evaluation before a drug begins any testing in humans.
We contend that a lack of emphasis on evidence for the efficacy of drug candidates is all too common in decisions about whether an experimental medicine can be tested in humans. We call for infrastructure, resources and better methods to rigorously evaluate the clinical promise of new interventions before testing them on humans for the first time.
The authors propose some sort of centralized clearinghouse to evaluate efficacy more rigorously. It is unclear what standards they envision this new multispecialty review body applying when green-lighting a drug to enter human testing. Instead they propose three questions:
  • What is the likelihood that the drug will prove clinically useful?
  • Assume the drug works in humans. What is the likelihood of observing the preclinical results?
  • Assume the drug does not work in humans. What is the likelihood of observing the preclinical results?
These seem like reasonable questions, I suppose – and are likely questions that are already being asked of preclinical data. They certainly do not rise to the level of providing a clear standard for regulatory approval, though perhaps it’s a reasonable place to start.
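One way to read those three questions (my framing, not the authors') is as the ingredients of a Bayes update: the first question supplies a prior, and the second and third supply the likelihoods that turn preclinical data into a posterior probability of clinical usefulness:

P(\text{useful} \mid \text{data}) =
  \frac{P(\text{data} \mid \text{useful})\, P(\text{useful})}
       {P(\text{data} \mid \text{useful})\, P(\text{useful}) + P(\text{data} \mid \text{not useful})\,\bigl(1 - P(\text{useful})\bigr)}

Framed this way, the proposal is essentially asking reviewers to judge whether the likelihood ratio of the preclinical package is large enough to meaningfully raise a typically low prior.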

The most obvious counterargument here is one that the authors curiously don’t pick up on at all: if we had the ability to accurately (or even semiaccurately) predict efficacy preclinically, pharma sponsors would already be doing it. The comment notes: “More-thorough assessments of clinical potential before trials begin could lower failure rates and drug-development costs.” And it’s hard not to agree: every pharmaceutical company would love to have even an incrementally-better sense of whether their early pipeline drugs will be shown to work as hoped.

The authors note
Commercial interests cannot be trusted to ensure that human trials are launched only when the case for clinical potential is robust. We believe that many FIH studies are launched on the basis of flimsy, underscrutinized evidence.
However, they do not produce any evidence that industry is in any way deliberately underperforming their preclinical work, merely that preclinical efficacy is often difficult to reproduce and is poorly correlated with drug performance in humans.

Pharmaceutical companies have many times more candidate compounds than they can possibly afford to put into clinical trials. Figuring out how to lower failure rates – or at least the total cost of failure - is a prominent industry obsession, and efficacy remains the largest source of late-stage trial failure. This quest to “fail faster” has resulted in larger and more expensive phase 2 trials, and even to increased efficacy testing in some phase 1 trials. And we do this not because of regulatory pressure, but because of hopes that these efforts will save overall costs. So it seems beyond probable that companies would immediately invest more in preclinical efficacy testing, if such testing could be shown to have any real predictive power. But generally speaking, it does not.

As a general rule, we don’t need regulations that are firmly aligned with market incentives, we need regulations if and when we think those incentives might run counter to the general good. In this case, there are already incredibly strong market incentives to improve preclinical assessments. Where companies have attempted to do something with limited success, it would seem quixotic to think that regulatory fiat will accomplish more.

(One further point. The authors try to link the need for preclinical efficacy testing to the 2016 Bial tragedy. This seems incredibly tenuous: the authors speculate that perhaps trial participants would not have been harmed and killed if Bial had been required to produce more evidence of BIA 10-2474's clinical efficacy before embarking on their phase 1 trials. But that would have been entirely coincidental in this case: if the drug had in fact shown more evidence of therapeutic promise, the tragedy still would have happened, because it had nothing at all to do with the drug's efficacy.

This is to some extent a minor nitpick, since the argument in favor of earlier efficacy testing does not depend on a link to Bial. However, I bring it up because a) the authors dedicate the first four paragraphs of their comment to the link, and b) there appears to be a minor trend of using the death and injuries of that trial to justify an array of otherwise-unrelated initiatives. This seems like a trend we should discourage.)

[Update 2/23: I posted this last night, not realizing that only a few hours earlier, John LaMattina had published on this same article. His take is similar to mine, in that he is suspicious of the idea that pharmaceutical companies would knowingly push ineffective drugs up their pipeline.]

Kimmelman, J., & Federico, C. (2017). Consider drug efficacy before first-in-human trials. Nature, 542(7639), 25-27. DOI: 10.1038/542025a





More young people are surviving cancer. Then they face a life altered by it

More people are getting cancer in their 20s, 30s, and 40s, and surviving, thanks to rapid advancement in care. Many will have decades of life ahead of them, which means they face greater and more complex challenges in survivorship. Lourdes Monje is navigating these waters at age 29.





Patrick Dempsey aims to raise awareness of cancer disparities and encourage screening

NPR's Leila Fadel talks with actor Patrick Dempsey about his efforts to raise money for cancer treatment and prevention.





With Trump coming into power, the NIH is in the crosshairs

The National Institutes of Health, the crown jewel of biomedical research in the U.S., could face big changes under the new Trump administration, some fueled by pandemic-era criticisms of the agency.





Chronic itch is miserable. Scientists are just scratching the surface

Journalist Annie Lowrey has a rare disease that causes a near-constant itch that doesn't respond to most treatments. She likens the itchiness to a car alarm: "You can't stop thinking about it."





Collection of High-Payout ("Gacor") Slot Games With the Highest RTP Percentages Today

In the ever-evolving world of online gambling, players' search for the best chances of winning leads to a popular phenomenon: today's collection of high-payout ("gacor") slot games with the highest RTP percentages…

The post Kumpulan Game Slot Gacor Dengan Persentase RTP Tertinggi Hari Ini appeared first on Biosimilarnews.





Secret Tips for Winning Easily at High-Payout Online Slots

Uncovering the secret to winning easily at high-payout ("gacor") online slots is every online gambler's dream. First, pay close attention to choosing the right slot machine. Pick a machine with a payout rate or…

The post Tips Rahasia Menang Mudah Main Slot Online Gacor appeared first on Biosimilarnews.





High-Payout, Easy-to-Win Habanero Slot Games

Habanero offers not just ordinary slot games but an adventure of limitless winning. With diverse themes, ranging from space adventures to the world of mythology, every Habanero game has its own unique…

The post Game Slot Gacor Gampang Menang Habanero appeared first on Biosimilarnews.





Today's Trusted Registration Links for High-Payout, Easy-Maxwin Slot Sites

The big payouts and excitement offered by online slot machines have made them increasingly popular. But in the sea of slot sites out there, how can you find the best slot site that can provide…

The post Link Daftar Situs Slot Gacor Gampang Menang Maxwin Terpercaya Hari Ini appeared first on Biosimilarnews.





MRI Sheds Its Shielding and Superconducting Magnets



Magnetic resonance imaging (MRI) has revolutionized healthcare by providing radiation-free, non-invasive 3-D medical images. However, MRI scanners often consume 25 kilowatts or more to power magnets producing magnetic fields up to 1.5 tesla. These requirements typically limit scanners’ use to specialized centers and departments in hospitals.

A University of Hong Kong team has now unveiled a low-power, highly simplified, full-body MRI device. With the help of artificial intelligence, the new scanner only requires a compact 0.05 T magnet and can run off a standard wall power outlet, requiring only 1,800 watts during operation. The researchers say their new AI-enabled machine can produce clear, detailed images on par with those from high-power MRI scanners currently used in clinics, and may one day help greatly improve access to MRI worldwide.

To generate images, MRI applies a magnetic field to align the poles of the body’s protons in the same direction. An MRI scanner then probes the body with radio waves, knocking the protons askew. When the radio waves turn off, the protons return to their original alignment, transmitting radio signals as they do so. MRI scanners receive these signals, converting them into images.
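For a sense of scale, the radio frequency involved is set by the proton Larmor relation, which is why a 0.05 T scanner operates at a far lower frequency than a clinical 1.5 T machine:

f = \frac{\gamma}{2\pi} B_0 \approx 42.6\ \mathrm{MHz/T} \times B_0,
\qquad f(0.05\ \mathrm{T}) \approx 2.1\ \mathrm{MHz},
\qquad f(1.5\ \mathrm{T}) \approx 63.9\ \mathrm{MHz}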

More than 150 million MRI scans are conducted worldwide annually, according to the Organization for Economic Cooperation and Development. However, despite five decades of development, clinical MRI procedures remain out of reach for more than two-thirds of the world’s population, especially in low- and middle-income countries. For instance, whereas the United States has 40 scanners per million inhabitants, in 2016 there were only 84 MRI units serving West Africa’s population of more than 370 million.

This disparity largely stems from the high costs and specialized settings required for standard MRI scanners. They use powerful superconducting magnets that require a lot of space, power, and specialized infrastructure. They also need rooms shielded from radio interference, further adding to hardware costs, restricting their mobility, and hampering their availability in other medical settings.

Scientists around the globe have already been exploring low-cost MRI scanners that operate at ultra-low-field (ULF) strengths of less than 0.1 T. These devices may consume much less power and prove potentially portable enough for bedside use. Indeed, as the Hong Kong team notes, MRI development initially focused on low fields of about 0.05 T, until the introduction of the first whole-body 1.5 T superconducting scanner by General Electric in 1983.

The new MRI scanner (top left) is smaller than conventional scanners, and does away with bulky RF shielding and superconducting magnets. The new scanner’s imaging resolution is on par with conventional scanners (bottom). Ed X. Wu/The University of Hong Kong

Current ULF MRI scanners often rely on AI to help reconstruct images from what signals they gather using relatively weak magnetic fields. However, until now, these devices were limited to solely imaging the brain, extremities, or single organs, Udunna Anazodo, an assistant professor of neurology and neurosurgery at McGill University in Montreal who did not take part in the work, notes in a review of the new study.

The Hong Kong team have now developed a whole-body ULF MRI scanner in which patients are placed between two permanent neodymium ferrite boron magnet plates—one above the body and the other below. Although these permanent magnets are far weaker than superconductive magnets, they are low-cost, readily available, and don’t require liquid helium or cooling to superconducting temperatures. In addition, the amount of energy ULF MRI scanners deposit into the body is roughly one-thousandth that from conventional scanners, making heat generation during imaging much less of a concern, Anazodo notes in her review. ULF MRI is also much quieter than regular MRI, which may help with pediatric scanning, she adds.
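The "roughly one-thousandth" energy figure is consistent with the usual rule of thumb (an approximation, assuming comparable pulse sequences) that RF power deposition scales with the square of the main field strength:

\frac{\mathrm{SAR}_{0.05\,\mathrm{T}}}{\mathrm{SAR}_{1.5\,\mathrm{T}}} \approx \left(\frac{0.05}{1.5}\right)^{2} = \frac{1}{900} \approx 10^{-3}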

The new machine consists of two units, each roughly the size of a hospital gurney. One unit houses the MRI device, while the other supports the patient’s body as it slides into the scanner.

To account for radio interference from both the outside environment and the ULF MRI’s own electronics, the scientists deployed 10 small sensor coils around the scanner and inside the electronics cabinet to help the machine detect potentially disruptive radio signals. They also employed deep learning AI methods to help reconstruct images even in the presence of strong noise. They say this eliminates the need for shielding against radio waves, making the new device far more portable than conventional MRI.
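A minimal sketch of the shielding-free idea is below. It assumes, as a simplification of the deep learning approach described, that the interference reaching the imaging coil can be predicted as a linear combination of what the external sensing coils record during a calibration window and then subtracted during the scan; the real system learns a more sophisticated mapping.

import numpy as np

rng = np.random.default_rng(0)
n_sensors, n_sources = 10, 3

# Unknown mixing of a few interference sources into the sensing coils and the receive coil
sensor_mix  = rng.standard_normal((n_sources, n_sensors))
receive_mix = rng.standard_normal(n_sources)

# Calibration window: no RF excitation, so the receive coil sees only interference
emi_cal     = rng.standard_normal((1000, n_sources))
sensors_cal = emi_cal @ sensor_mix
receive_cal = emi_cal @ receive_mix
coeffs, *_  = np.linalg.lstsq(sensors_cal, receive_cal, rcond=None)

# Scan: the weak MR signal is buried in interference
emi_scan     = rng.standard_normal((5000, n_sources))
mr_signal    = 0.1 * rng.standard_normal(5000)
receive_scan = mr_signal + emi_scan @ receive_mix
cleaned      = receive_scan - (emi_scan @ sensor_mix) @ coeffs

print(np.std(receive_scan - mr_signal), np.std(cleaned - mr_signal))  # residual interference before vs. after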

In tests on 30 healthy volunteers, the device captured detailed images of the brain, spine, abdomen, heart, lung, and extremities. Scanning each of these targets took eight minutes or less, at an image resolution of roughly 2 by 2 by 8 millimeters per voxel. In Anazodo’s review, she notes the new machine produced image qualities comparable to those of conventional MRI scanners.

“It’s the beginning of a multidisciplinary endeavor to advance an entirely new class of simple, patient-centric and computing-powered point-of-care diagnostic imaging device,” says Ed Wu, a professor and chair of biomedical engineering at the University of Hong Kong.

The researchers used standard off-the-shelf electronics. All in all, they estimate hardware costs at about US $22,000. (According to imaging equipment company Block Imaging in Holt, Michigan, entry-level MRI scanners start at $225,000, and advanced premium machines can cost $500,000 or more.)

The prototype scanner’s magnet assembly is relatively heavy, weighing about 1,300 kilograms. (This is still lightweight compared to a typical clinical MRI scanner, which can weigh up to 17 tons, according to New York University’s Langone Health center.) The scientists note that optimizing the hardware could reduce the magnet assembly’s weight to about 600 kilograms, which would make the entire scanner mobile.

The researchers note their new device is not meant to replace conventional high-magnetic-field MRI. For instance, a 2023 study notes that next-generation MRI scanners using powerful 7 T magnets could yield a resolution of just 0.35 millimeters. Instead, ULF MRI can complement existing MRI by going to places that can’t host standard MRI devices, such as intensive care units and community clinics.

In an email, Anazodo adds that this new Hong Kong work is just one of a number of exciting ULF MRI scanners under development. For instance, she notes that Gordon Sarty at the University of Saskatchewan and his colleagues are developing a device that is potentially even lighter, cheaper, and more portable than the Hong Kong machine, which they are researching for use in whole-body imaging on the International Space Station.

Wu and his colleagues detailed their findings online 10 May in the journal Science.

This article appears in the July 2024 print issue as “Compact MRI Ditches Superconducting Magnets.”





Microneedle Glucose Sensors Keep Monitoring Skin-Deep



For people with diabetes, glucose monitors are a valuable tool to monitor their blood sugar. The current generation of these biosensors detect glucose levels with thin, metallic filaments inserted in subcutaneous tissue, the deepest layer of the skin where most body fat is stored.

Medical technology company Biolinq is developing a new type of glucose sensor that doesn’t go deeper than the dermis, the middle layer of skin that sits above the subcutaneous tissue. The company’s “intradermal” biosensors take advantage of metabolic activity in shallower layers of skin, using an array of electrochemical microsensors to measure glucose—and other chemicals in the body—just beneath the skin’s surface.

Biolinq just concluded a pivotal clinical trial earlier this month, according to CEO Rich Yang, and the company plans to submit the device to the U.S. Food and Drug Administration for approval at the end of the year. In April, Biolinq received US $58 million in funding to support the completion of its clinical trials and subsequent submission to the FDA.

Biolinq’s glucose sensor is “the world’s first intradermal sensor that is completely autonomous,” Yang says. While other glucose monitors require a smartphone or other reader to collect and display the data, Biolinq’s includes an LED display to show when the user’s glucose is within a healthy range (indicated by a blue light) or above that range (yellow light). “We’re providing real-time feedback for people who otherwise could not see or feel their symptoms,” Yang says. (In addition to this real-time feedback, the user can also load long-term data onto a smartphone by placing it next to the sensor, like Abbott’s FreeStyle Libre, another glucose monitor.)

More than 2,000 microsensor components are etched onto each 200-millimeter silicon wafer used to manufacture the biosensors. Biolinq

Biolinq’s hope is that its approach could lead to sustainable changes in behavior on the part of the individual using the sensor. The device is intentionally placed on the upper forearm to be in plain sight, so users can receive immediate feedback without manually checking a reader. “If you drink a glass of orange juice or soda, you’ll see this go from blue to yellow,” Yang explains. That could help users better understand how their actions—such as drinking a sugary beverage—change their blood sugar and take steps to reduce that effect.

Biolinq’s device consists of an array of microneedles etched onto a silicon wafer using semiconductor manufacturing. (Other glucose sensors’ filaments are inserted with an introducer needle.) Each chip has a small 2-millimeter by 2-millimeter footprint and contains seven independent microneedles, which are coated with membranes through a process similar to electroplating in jewelry making. One challenge the industry has faced is ensuring that microsensors do not break at this small scale. The key engineering insight Biolinq introduced, Yang says, was using semiconductor manufacturing to build the biosensors. Importantly, he says, silicon “is harder than titanium and steel at this scale.”

Miniaturization allows for sensing closer to the surface of the skin, where there is a high level of metabolic activity. That makes the shallow depth ideal for monitoring glucose, as well as other important biomarkers, Yang says. Due to this versatility, combined with the use of a sensor array, the device in development can also monitor lactate, an important indicator of muscle fatigue. With the addition of a third data point, ketones (which are produced when the body burns fat), Biolinq aims to “essentially have a metabolic panel on one chip,” Yang says.

Using an array of sensors also creates redundancy, improving the reliability of the device if one sensor fails or becomes less accurate. Glucose monitors tend to drift over the course of wear, but with multiple sensors, Yang says that drift can be better managed.
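Biolinq hasn’t said how it combines its channels, but a common way to exploit that kind of redundancy is robust aggregation—for instance, taking the median across working sensors so a single drifting or failed channel can’t skew the reading. A minimal sketch of the general idea, not the company’s algorithm:

# Illustrative only: fuse readings from an array of microsensors so that one
# failed or drifting channel does not dominate the reported value. This is a
# generic approach, not Biolinq's actual algorithm.
from statistics import median

def fuse_channels(readings, min_valid=3):
    """Fuse per-channel glucose estimates; None marks a failed channel."""
    valid = [r for r in readings if r is not None]
    if len(valid) < min_valid:
        raise ValueError("too few working channels for a reliable estimate")
    return median(valid)  # robust to a single outlier or drifted channel

print(fuse_channels([101, 103, 99, None, 160, 102, 100]))  # -> 101.5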

One downside to the autonomous display is the drain on battery life, Yang says. The battery life limits the biosensor’s wear time to 5 days in the first-generation device. Biolinq aims to extend that to 10 days of continuous wear in its second generation, which is currently in development, by using a custom chip optimized for low-power consumption rather than off-the-shelf components.

The company has collected nearly 1 million hours of human performance data, along with comparators including commercial glucose monitors and venous blood samples, Yang says. Biolinq aims to gain FDA approval first for use in people with type 2 diabetes not using insulin and later expand to other medical indications.

This article appears in the August 2024 print issue as “Glucose Monitor Takes Page From Chipmaking.”




ng

Bath Engineers Bet on Dirt for Micropower



A thimbleful of soil can contain a universe of microorganisms, up to 10 billion by some estimates. Now a group of researchers in Bath, United Kingdom, is building prototype technologies that harvest electrons exhaled by some of those microbial species.

The idea is to power low-power sensors and switches, and perhaps help farmers digitally optimize crop yields to meet increasing demand and ever more stressful growing conditions. There could be other tasks, too, that might make use of a plant-and-forget, low-yield power source—such as monitoring canals for illegal waste dumping.

The research started small, based out of the University of Bath, with field-testing in a Brazilian primary school classroom and a green pond near it—just before the onset of the pandemic.

“We had no idea what the surroundings would be. We just packed the equipment we needed and went,” says Jakub Dziegielowski, a chemical engineering Ph.D. student at the University of Bath. “And the pond was right by the school—it was definitely polluted, very green, with living creatures in it, and definitely not something I’d feel comfortable drinking from. So it got the job done.”

The experiments they ran with kids from the school and Brazilian researchers that summer of 2019 were aimed at powering water purifiers. The system worked—but it wasn’t very efficient compared with, say, a solar panel.

So work has moved on in the Bath labs: in the coming weeks, Dziegielowski will both turn 29 and graduate with his doctorate. And he, along with two University of Bath advisors and colleagues, recently launched a spinoff company—called Bactery—to perfect a prototype for a network of soil microbial fuel cells for use in agriculture.

A microbial fuel cell is a kind of power plant that converts chemical energy stored in organic molecules into electrical energy, using microbes as a catalyst. The term most often refers to liquid-based systems, Dziegielowski says: organics from wastewater serve as the energy source, and the liquid stream flows past the electrodes.

A soil microbial fuel cell, however, has one of its electrodes—the anode, which absorbs electrons—in the dirt. The other electrode, the cathode, is exposed to air. Batteries work because ions move through an electrolyte between electrodes to complete a circuit. In this case, the soil itself acts as the electrolyte—and as the source of both the catalytic microbes and the fuel.

The Bath, U.K.-based startup Bactery has developed a set of fuel cells powered by microbes in the soil—with, in the prototype pictured here, graphite mats as electrodes. University of Bath

Fields full of Watts

In a primary school in the fishing village of Icapuí on Brazil’s semi-arid northeastern coast, the group made use of basic components: graphite felt mats acting as electrodes, and nylon pegs to maintain spacing and alignment between them. (Bactery is now developing new kinds of casing.)

By setting up the cells in a parallel matrix, the Icapuí setup could generate 38 milliwatts per square meter. In work since, the Bath group’s been able to reach 200 milliwatts per square meter.

Electroactive bacteria—also called exoelectrogens or electricigens—take in soluble iron, acids, or sugars and exhale electrons. Dozens of microbial species can do this, including bacteria belonging to genera such as Geobacter and Shewanella, among many others.

But 200 milliwatts per square meter is not a lot of juice: enough to charge a mobile phone, maybe, or keep an LED nightlight going—or, perhaps, serve as a power source for sensors or irrigation switches. “As in so many things, it comes down to the economics,” says Bruce Logan, an environmental engineer at Penn State who wrote a 2007 book, Microbial Fuel Cells.
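Those economics start with simple sizing arithmetic. Using the power densities reported above, a rough calculation shows how much soil-electrode area a given load would demand (the example loads are assumed, typical values, not Bactery figures):

# Back-of-the-envelope sizing for a soil microbial fuel cell array. The power
# densities come from the article; the example loads are assumed values.
DENSITY_FIELD = 0.038  # watts per square meter (38 mW/m2, 2019 Icapui test)
DENSITY_LAB = 0.200    # watts per square meter (200 mW/m2, later lab work)

def area_needed(load_watts, density_w_per_m2):
    """Soil-electrode area in square meters needed for a continuous load."""
    return load_watts / density_w_per_m2

for name, load in [("soil-moisture sensor node, ~10 mW", 0.010),
                   ("LED nightlight, ~100 mW", 0.100)]:
    print(f"{name}: {area_needed(load, DENSITY_LAB):.2f} m2 (lab) / "
          f"{area_needed(load, DENSITY_FIELD):.2f} m2 (field)")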

A decade ago Palo Alto engineers launched the MudWatt, a self-contained kit that could light a small LED. It’s mostly marketed as a school science project. But even now, some 760 million people do not have reliable access to electricity. “In remote areas, soil microbial fuel cells with higher conversion and power management efficiencies would fare better than batteries,” says Sheela Berchmans, a retired chief scientist of the Central Electrochemical Research Institute in Tamil Nadu, India.

Korneel Rabaey, a professor in the department of biotechnology at Ghent University, in Belgium, says electrochemical micro-power sources—a category that now includes the Bactery battery—are gaining buzz in resource recovery, for uses such as extracting pollutants from wastewater with electricity as a byproduct. “You can think of many applications that don’t require a lot of power,” he says, “but where sensors are important.”




ng

Startups Launch Life-Saving Tech for the Opioid Crisis



Tech startups are stepping up to meet the needs of 60 million people worldwide who use opioids, representing about 1 percent of the world’s adult population. In the United States, deaths involving synthetic opioids have risen 1,040 percent from 2013 to 2019. The COVID-19 pandemic and continued prevalence of fentanyl have since worsened the toll, with an estimated 81,083 fatal overdoses in 2023 alone.

Innovations include biometric monitoring systems that help doctors determine proper medication dosages, nerve stimulators that relieve withdrawal symptoms, wearable and ingestible systems that watch for signs of an overdose, and autonomous drug delivery systems that could prevent overdose deaths.

Helping Patients Get the Dosage They Need

For decades, opioid blockers and other medications that suppress cravings have been the primary treatment tool for opioid addiction. However, despite its clinical dominance, this approach remains underutilized. In the United States, only about 22 percent of the 2.5 million adults with opioid use disorder receive medication-assisted therapy such as methadone, Suboxone, and similar drugs.

Determining patients’ ideal dosage during the early stages of treatment is crucial for keeping them in recovery programs. The shift from heroin to potent synthetic opioids, like fentanyl, has complicated this process, as the typical recommended medication doses can be too low for those with a high fentanyl tolerance.

A North Carolina-based startup is developing a predictive algorithm to help clinicians tailor these protocols and track real-time progress with biometric data. OpiAID, which is currently working with 1,000 patients across three clinical sites, recently launched a research pilot with virtual treatment provider Bicycle Health. Patients taking Suboxone will wear a Samsung Galaxy Watch6 to measure their heart rate, body movements, and skin temperature. OpiAID CEO David Reeser says clinicians can derive unique stress indications from this data, particularly during withdrawal. (He declined to share specifics on how the algorithm works.)

“Identifying stress biometrically plays a role in how resilient someone will be,” Reeser adds. “For instance, poor heart rate variability during sleep could indicate that a patient may be more susceptible that day. In the presence of measurable amounts of withdrawal, the potential for relapse on illicit medications may be more likely.”
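OpiAID won’t say how its algorithm works, but the heart-rate-variability signal Reeser mentions is a standard, openly documented stress marker that consumer wearables can supply. One common formulation is RMSSD, sketched below purely as an illustration of the kind of feature such a model might ingest—not as OpiAID’s method:

# Generic heart-rate-variability feature of the kind wearables expose.
# RMSSD: root mean square of successive differences between heartbeats.
# Lower values during sleep are commonly read as higher physiological stress.
# Illustrative only; this is not OpiAID's proprietary model.
import math

def rmssd(rr_intervals_ms):
    """RMSSD (ms) from a list of R-R intervals in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

calm = [820, 835, 810, 845, 825, 840]       # more beat-to-beat variation
stressed = [700, 702, 699, 701, 700, 698]   # flat, low-variability rhythm
print(f"calm RMSSD: {rmssd(calm):.1f} ms, stressed RMSSD: {rmssd(stressed):.1f} ms")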

Nerve Stimulators Provide Opioid Withdrawal Relief

While OpiAID’s software solution relies on monitoring patients, electrical nerve stimulation devices take direct action. These behind-the-ear wearables place electrodes over nerve endings around the ear and send electrical pulses to block pain signals and relieve withdrawal symptoms like anxiety and nausea.

The U.S. Food and Drug Administration (FDA) has cleared several nerve stimulator devices, such as DyAnsys’ Drug Relief, which periodically administers low-level electrical pulses to the ear’s cranial nerves. Others include Spark Biomedical’s Sparrow system and NET Recovery’s NETNeuro device.

Masimo’s behind-the-ear Bridge device costs US $595 for treatment providers. Masimo

Similarly, Masimo’s Bridge relieves withdrawal symptoms by stimulating the brain and spinal cord via electrodes. The device is intended to help patients initiating, transitioning into, or tapering off medication-assisted treatment. In a clinical trial, Bridge reduced symptom severity by 85 percent in the first hour and 97 percent by the fifth day. A Masimo spokesperson said the company’s typical customers are treatment providers and correctional facilities, though it’s also seeing interest from emergency room physicians.

Devices Monitor Blood Oxygen to Prevent Overdose Deaths

In 2023, the FDA cleared Masimo’s Opioid Halo device to monitor blood oxygen levels and alert emergency contacts if it detects opioid-induced respiratory depression, the leading cause of overdose deaths. The product includes a pulse oximeter cable and disposable sensors connected to a mobile app.

Opioid Halo utilizes Masimo’s signal extraction technology, first developed in the 1990s, which improves upon conventional oxygen monitoring techniques by filtering out artifacts caused by blood movement. Masimo employs four signal-processing engines to distinguish the true signal from noise that can lead to false alarms; for example, they distinguish between arterial blood and low-oxygen venous blood.
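Masimo doesn’t disclose the internals of its signal extraction technology, but for orientation, conventional pulse oximetry rests on a simple “ratio of ratios” of the pulsatile red and infrared light absorbed by the blood, mapped to an oxygen saturation through an empirical calibration. A minimal, textbook-style sketch (the example numbers are made up):

# Textbook pulse-oximetry arithmetic, for orientation only -- Masimo's SET adds
# adaptive noise cancellation on top of (and beyond) this basic calculation.
# The calibration SpO2 ~= 110 - 25*R is a widely quoted approximation.
def spo2_ratio_of_ratios(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate SpO2 (%) from pulsatile (AC) and steady (DC) light levels."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    return 110.0 - 25.0 * r

# Example values are invented to show the shape of the calculation.
print(f"{spo2_ratio_of_ratios(0.012, 1.0, 0.024, 1.0):.0f}%")  # R = 0.5 -> ~98%
print(f"{spo2_ratio_of_ratios(0.030, 1.0, 0.030, 1.0):.0f}%")  # R = 1.0 -> ~85%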

Masimo’s Opioid Halo system is available over-the-counter without a prescription. Masimo

Opioid Halo is available over-the-counter for US $250. A spokesperson says sales have continued to show promise as more healthcare providers recommend it to high-risk patients.

An Ingestible Sensor to Watch Over Patients

Last year, in a first-in-human clinical study, doctors used an ingestible sensor to monitor vital signs from patients’ stomachs. Researchers analyzed the breathing patterns and heart rates of 10 sleep study patients at West Virginia University. Some participants had episodes of central sleep apnea, which can be a proxy for opioid-induced respiratory depression. The capsule transmitted this data wirelessly to external equipment linked to the cloud.

Celero’s Rescue-Rx capsule would reside in a user’s stomach for one week. Benjamin Pless/Celero Systems

“To our knowledge, this is the first time anyone has demonstrated the ability to accurately monitor human cardiac and respiratory signals from an ingestible device,” says Benjamin Pless, one of the study’s co-authors. “This was done using very low-power circuitry including a radio, microprocessor, and accelerometer along with software for distinguishing various physiological signals.”
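Pless doesn’t spell out the signal-processing software, but a generic way to pull breathing and heartbeat out of a single motion signal is to look for the dominant frequency in the band where each lives—roughly 0.1 to 0.5 hertz for respiration and about 1 to 3 hertz for the heartbeat. A minimal sketch of that idea on synthetic data (the sample rate, bands, and amplitudes are assumptions, not Celero’s parameters):

# Generic band-separation sketch: estimate breathing and heart rates from one
# motion signal by finding the dominant frequency in each band. The bands and
# sample rate are textbook ballparks, not Celero's implementation.
import numpy as np

fs = 50.0                                       # sample rate, Hz (assumed)
t = np.arange(0, 60, 1 / fs)                    # one minute of synthetic data
signal = (1.0 * np.sin(2 * np.pi * 0.25 * t)    # breathing at 15 breaths/min
          + 0.3 * np.sin(2 * np.pi * 1.2 * t)   # heartbeat at 72 beats/min
          + 0.1 * np.random.randn(t.size))      # sensor noise

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, 1 / fs)

def dominant(lo, hi):
    band = (freqs >= lo) & (freqs <= hi)
    return freqs[band][np.argmax(spectrum[band])]

print(f"respiration ~{dominant(0.1, 0.5) * 60:.0f} breaths/min")
print(f"heart rate  ~{dominant(0.8, 3.0) * 60:.0f} beats/min")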

Pless and colleagues from MIT and Harvard Medical School started Celero Systems to commercialize a modified version of that capsule, one that will also release an opioid antagonist after detecting respiratory depression. Pless, Celero’s CEO, says the team has successfully demonstrated the delivery of nalmefene, an opioid antagonist similar to Narcan, to rapidly reverse overdoses.

Celero’s next step is integrating the vitals-monitoring feature for human trials. The company’s final device, Rescue-Rx, is intended to stay in the stomach for one week before passing naturally. Pless says Rescue-Rx’s ingestible format will make the therapy cheaper and more accessible than wearable autoinjectors or implants.

Celero’s capsule can detect vital signs from within the stomach.

Autonomous Delivery of Overdose Medication

Rescue-Rx isn’t the only autonomous drug-delivery project under development. A recent IEEE Transactions on Biomedical Circuits and Systems paper introduced a wrist-worn near-infrared spectroscopy sensor to detect low blood oxygen levels related to an overdose.

Purdue University biomedical engineering professor Hugh Lee and graduate student Juan Mesa, who both co-authored the study, say that while additional human experiments are necessary, the findings represent a valuable tool in counteracting the epidemic. “Our wearable device consistently detected low-oxygenation events, triggered alarms, and activated the circuitry designed to release the antidote through the implantable capsule,” they wrote in an email.

Lee and Purdue colleagues founded Rescue Biomedical to commercialize the A2D2 system, which includes a wristband and an implanted naloxone capsule that releases the drug if oxygen levels drop below 90 percent. Next, the team will evaluate the closed-loop system in mice.
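As described, the closed loop is simple in outline: if the wristband’s oxygen reading stays below the 90 percent threshold, the implanted capsule releases its naloxone dose. A schematic sketch of that logic (the confirmation count and sample cadence are assumptions added for illustration, not Rescue Biomedical’s published design):

# Schematic of the closed loop described for the A2D2 system: wristband
# oximetry feeding a one-shot naloxone release. The 5-sample confirmation
# window is an assumption for illustration.
class OverdoseResponder:
    def __init__(self, spo2_threshold=90, confirm_samples=5):
        self.threshold = spo2_threshold
        self.confirm = confirm_samples
        self.low_count = 0
        self.released = False

    def update(self, spo2):
        """Feed one SpO2 reading; return True when the antidote is triggered."""
        if self.released:
            return False
        self.low_count = self.low_count + 1 if spo2 < self.threshold else 0
        if self.low_count >= self.confirm:
            self.released = True  # fire the implanted capsule once
        return self.released

responder = OverdoseResponder()
for reading in [96, 95, 88, 87, 86, 85, 84]:
    if responder.update(reading):
        print("release naloxone at SpO2 =", reading)
        break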

This story was updated on 27 August 2024 to correct the name of Masimo’s Opioid Halo device.




ng

Superconducting Wire Sets New Current Capacity Record



UPDATE 31 OCTOBER 2024: No. 1 no longer. The would-be groundbreaking study published in Nature Communications by Amit Goyal and colleagues, which claimed the world’s highest-performing high-temperature superconducting wires yet, has been retracted by the authors.

The journal’s editorial statement that now accompanies the paper says that after publication, an error in the calculation of the reported performance was identified. All of the study’s authors agreed with the retraction.

The researchers were first alerted to the issue by Evgeny Talantsev at the Mikheev Institute of Metal Physics in Ekaterinburg, Russia, and Jeffery Tallon at the Victoria University of Wellington in New Zealand. In a 2015 study, the two researchers had suggested upper limits for thin-film superconductors, and Tallon notes follow-up papers showed these limits held for more than 100 known superconductors. “The Goyal paper claimed current densities 2.5 times higher, so it was immediately obvious to us that there was a problem here,” he says.

Upon request, Goyal and his colleagues “very kindly agreed to release their raw data and did so quickly,” Tallon says. He and Talantsev discovered a mistake in the conversion of magnetization units.

“Most people who had been in the game for a long time would be fully conversant with the units conversion because the instruments all deliver magnetic data in [centimeter-gram-second] gaussian units, so they always have to be converted to [the International System of Units],” Tallon says. “It has always been a little tricky, but students are asked to take great care and check their numbers against other reports to see if they agree.”

In a statement, Goyal notes he and his colleagues “intend to continue to push the field forward” by continuing to explore ways to enhance wire performance using nanostructural modifications. —Charles Q. Choi

Original article from 17 August 2024 follows:

Superconductors have for decades spurred dreams of extraordinary technological breakthroughs, but many practical applications for them have remained out of reach. Now a new study reveals what may be the world’s highest-performing high-temperature superconducting wires yet—ones that carry 50 percent more current than the previous record-holder. The scientists add that this advance was achieved without adding cost or complexity to the way superconducting wires are currently made.

Superconductors conduct electricity with zero resistance. Classic superconductors work only at extremely cold temperatures, below 30 kelvins. In contrast, high-temperature superconductors can operate at temperatures above 77 K, which means they can be cooled to superconductivity using comparatively inexpensive and less burdensome cryogenics built around liquid nitrogen coolant.

Regular electrical conductors all resist electron flow to some degree, resulting in wasted energy. The fact that superconductors conduct electricity without dissipating energy has long led to dreams of significantly more efficient power grids. In addition, because rivers of electric current can course through them, superconductors can serve as powerful electromagnets, for applications such as maglev trains, better MRI scanners for medicine, doubling the amount of power generated by wind turbines, and nuclear fusion power plants.

“Today, companies around the world are fabricating kilometer-long, high-temperature superconductor wires,” says Amit Goyal, SUNY Distinguished Professor and SUNY Empire Innovation Professor at the University at Buffalo in New York.

However, many large-scale applications for superconductors may stay fantasies until researchers can find a way to fabricate high-temperature superconducting wires in a more cost-effective manner.

In the new research, scientists have created wires that have set new records for the amount of current they can carry at temperatures ranging from 5 K to 77 K. Moreover, fabrication of the new wires requires processes no more complex or costly than those currently used to make high-temperature superconducting wires.

“The performance we have reported in 0.2-micron-thick wires is similar to wires almost 10 times thicker,” Goyal says.

At 4.2 K, the new wires carried 190 million amps per square centimeter without any externally applied magnetic field—some 50 percent better than results reported in 2022 and a full 100 percent better than ones detailed in 2021, Goyal and his colleagues note. At 20 K and under an externally applied magnetic field of 20 tesla—the kind of conditions envisioned for fusion applications—the new wires can carry about 9.3 million amps per square centimeter, roughly five times as much as present-day commercial high-temperature superconductor wires, they add.
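For a sense of scale, here is the arithmetic those current densities imply. This is a minimal sketch: the 4-millimeter tape width is an assumed, typical REBCO tape dimension used only to turn current density into an absolute current, not a figure from the study.

# Quick arithmetic on the reported current densities. The tape width is an
# assumption for illustration; the density and thickness come from the article.
JC_SELF_FIELD = 190e6       # A/cm^2 at 4.2 K, no applied field (reported)
FILM_THICKNESS_CM = 0.2e-4  # 0.2 micrometers
TAPE_WIDTH_CM = 0.4         # assumed 4-mm-wide tape

cross_section = FILM_THICKNESS_CM * TAPE_WIDTH_CM  # cm^2
print(f"current per tape: {JC_SELF_FIELD * cross_section:.0f} A")

# Implied earlier records, from the stated 50 percent and 100 percent gains:
print(f"2022 record: ~{JC_SELF_FIELD / 1.5 / 1e6:.0f} MA/cm^2")
print(f"2021 record: ~{JC_SELF_FIELD / 2.0 / 1e6:.0f} MA/cm^2")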

Another factor key to the success of commercial high-temperature superconductor wires is pinning force—the ability to keep magnetic vortices, which would otherwise interfere with electron flow, pinned in place within the superconductor. (Higher pinning-force values are better here: more conducive to the range of applications expected for such high-capacity, high-temperature superconductors.) The new wires showed record-setting pinning forces of more than 6.4 trillion newtons at 4.3 K under a 7-tesla magnetic field—more than twice as high as results reported in 2022.

The new wires are based on rare-earth barium copper oxide (REBCO). Within the superconductor, nanometer-scale columns of insulating, non-superconducting barium zirconate, spaced nanometers apart, help pin down magnetic vortices, allowing for higher supercurrents.

The researchers made these gains after a few years spent optimizing deposition processes, Goyal says. “We feel that high-temperature superconductor wire performance can still be significantly improved,” he adds. “We have several paths to get to better performance and will continue to explore these routes.”

Based on these results, high-temperature superconductor wire manufacturers “will hopefully further optimize their deposition conditions to improve the performance of their wires,” Goyal says. “Some companies may be able to do this in a short time.”

The hope is that superconductor companies will be able to significantly improve performance without too many changes to present-day manufacturing processes. “If high-temperature superconductor wire manufacturers can even just double the performance of commercial high-temperature superconductor wires while keeping capital equipment costs the same, it could make a transformative impact to the large-scale applications of superconductors,” Goyal says.

The scientists detailed their findings on 7 August in the journal Nature Communications.

This story was updated on 19 August 2024 to correct Amit Goyal’s title and affiliation.




ng

Electrical Stitches Speed Wound Healing in Rats



Surgical stitches that generate electricity can help wounds heal faster in rats, a new study from China finds.

In the body, electricity helps the heart beat, causes muscles to contract, and enables the body to communicate with the brain. Now scientists are increasingly using electricity to promote healing with so-called electroceuticals. These electrotherapies often seek to mimic the electrical signals the body naturally uses to help new cells migrate to wounds to support the healing process.

In the new study, researchers focused on sutures, which are used to close wounds and surgical incisions. Although medical devices have evolved rapidly over the years, sutures remain limited in capability, says Zhouquan Sun, a doctoral candidate at Donghua University in Shanghai. “This observation led us to explore integrating advanced therapeutics into sutures,” Sun says.

Prior work sought to enhance sutures by adding drugs or growth factors to the stitches. However, most of these drugs either had insignificant effects on healing or triggered side effects such as allergic reactions or nausea. Growth factors in sutures often degraded before they could have any effect, or failed to activate entirely.

The research team that created the new sutures has spent nearly 10 years developing fibers for electronics, for applications such as sensors. “This is our first attempt to apply fiber electronics in the biomedical field,” says Chengyi Hou, a professor of materials science and engineering at Donghua University.

Making Electrical Sutures Work

The new sutures are roughly 500 microns wide, or about five times the width of the average human hair. Like typical sutures, the new stitches are biodegradable, avoiding the need for doctors to remove the stitches and potentially cause more damage to a wound.

Each suture is made of a magnesium filament core wrapped in poly(lactic-co-glycolic) acid (PLGA) nanofibers, a commercially available, inexpensive, biodegradable polymer used in sutures. The suture also includes an outer sheath made of polycaprolactone (PCL), a biodegradable polyester and another common suture material.

Previously, electrotherapy devices were often bulky and expensive, and required wires connected to an external battery. The new stitches are instead powered by the triboelectric effect, the most common cause of static electricity. When two different materials repeatedly touch and then separate—in the case of the new suture, its core and sheath—the surface of one material can steal electrons from the surface of the other. This is why rubbing your feet on a carpet or running a comb through hair can build up electric charge.

A common problem sutures face is that daily movements may cause strain that reduces their efficacy. The new stitches take advantage of these motions, generating electricity that helps wounds heal.

The main obstacle the researchers had to surmount was developing a suture that was both thin and strong enough to serve in medicine. Over the course of nearly two years, they tinkered with the molecular weights of the polymers they used and refined their fiber spinning technology to reduce their suture’s diameter while maintaining strength, Sun says.

In lab experiments on rats, the sutures generated about 2.3 volts during normal exercise. The scientists found the new sutures could speed up wound healing by 50 percent over the course of 10 days compared to conventional sutures. They also significantly lowered bacteria levels even without the use of daily wound disinfectants, suggesting they could reduce the risk of post-operation infections.

“Future research may delve deeper into the molecular mechanisms of how electrical stimulation facilitates wound healing,” says Hui Wang, a chief physician at Shanghai Sixth People’s Hospital.

Further tests are needed in clinical settings to assess how effective these sutures are in humans. If such experiments prove successful, “this bioabsorbable electrically stimulating suture could change how we treat injuries in the future,” Hou says.

The scientists detailed their findings online 8 October in the journal Nature Communications.




ng

Bluetooth Microscope Reveals the Inner Workings of Mice



This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.

Any imaging technique that allows scientists to observe the inner workings of a living organism in real time provides a wealth of information compared to experiments in a test tube. While many such imaging approaches exist, they require test subjects—in this case rodents—to be tethered to the monitoring device. This limits the ability of animals under study to roam freely during experiments.

Researchers have recently designed a new microscope with a unique feature: It’s capable of transmitting real-time imaging from inside live mice via Bluetooth to a nearby phone or laptop. Once the device has been further miniaturized, the wireless connection will allow mice and other test subject animals to roam freely, making it easier to observe them in a more natural state.

“To the best of our knowledge, this is the first Bluetooth wireless microscope,” says Arvind Pathak, a professor at the Johns Hopkins University School of Medicine.

Through a series of experiments, Pathak and his colleagues demonstrate how the novel wireless microscope, called BLEscope, offers continuous monitoring of blood vessels and tumors in the brains of mice. The results are described in a study published 24 September in IEEE Transactions on Biomedical Engineering.

Microscopes have helped shed light on many biological mysteries, but the devices typically require that cells be removed from an organism and studied in a test tube. Any opportunity to study biological processes as they naturally occur in the body (“in vivo”) tends to offer more useful and thorough information.

Several different miniature microscopes designed for in vivo experiments in animals exist. However, Pathak notes that these often require high power consumption or a wire to be tethered to the device to transmit the data—or both—which may restrict an animal’s natural movements and behavior.

“To overcome these hurdles, [Johns Hopkins University Ph.D. candidate] Subhrajit Das and our team designed an imaging system that operates with ultra-low power consumption—below 50 milliwatts—while enabling wireless data transmission and continuous, functional imaging at spatial resolutions of 5 to 10 micrometers in [rodents],” says Pathak.

The researchers created BLEscope using an off-the-shelf, low-power image sensor and microcontroller, which are integrated on a printed circuit board. Importantly, it has two LED lights of different colors—green and blue—that help create contrast during imaging.

“The BLE protocol enabled wireless control of the BLEscope, which then captures and transmits images wirelessly to a laptop or phone,” Pathak explains. “Its low power consumption and portability make it ideal for remote, real-time imaging.”
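The receiving side of such a BLE link can be approximated with off-the-shelf tooling: subscribe to a notification characteristic and reassemble the incoming packets into a frame. The sketch below uses the open-source bleak library; the device address, characteristic UUID, and frame size are invented for illustration and are not from the paper.

# Hypothetical sketch of receiving image data over BLE notifications, the
# general mechanism the BLEscope relies on. The address, UUID, and frame size
# below are placeholders, not values from the study.
import asyncio
from bleak import BleakClient  # pip install bleak

DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"                      # hypothetical
IMAGE_CHAR_UUID = "0000abcd-0000-1000-8000-00805f9b34fb"  # hypothetical

async def receive_frame(expected_bytes=320 * 240):
    chunks = bytearray()
    done = asyncio.Event()

    def on_notify(_sender, data: bytearray):
        chunks.extend(data)  # notifications arrive in small packets
        if len(chunks) >= expected_bytes:
            done.set()

    async with BleakClient(DEVICE_ADDRESS) as client:
        await client.start_notify(IMAGE_CHAR_UUID, on_notify)
        await done.wait()    # block until one full frame has arrived
        await client.stop_notify(IMAGE_CHAR_UUID)
    return bytes(chunks[:expected_bytes])

# asyncio.run(receive_frame())  # would run against real hardware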

Pathak and his colleagues tested BLEscope in live mice through two experiments. In the first scenario, they added a fluorescent marker into the blood of mice and used BLEscope to characterize blood flow within the animals’ brains in real-time. In the second experiment, the researchers altered the oxygen and carbon dioxide ratios of the air being breathed in by mice with brain tumors, and were able to observe blood vessel changes in the fluorescently marked tumors.

“The BLEscope’s key strength is its ability to wirelessly conduct high-resolution, multi-contrast imaging for up to 1.5 hours, without the need for a tethered power supply,” Pathak says.

However, Pathak points out that the current prototype is limited by its size and weight. BLEscope will need to be further miniaturized, so that it doesn’t interfere with animals’ abilities to roam freely during experiments.

“We’re planning to miniaturize the necessary electronic components onto a flexible, lightweight printed circuit board, which would reduce the weight and footprint of the BLEscope and make it suitable for use on freely moving animals,” says Pathak.

This story was updated on 14 October 2024, to correct a statement about the size of the BLEscope.




ng

Dean Kamen Says Inventing Is Easy, but Innovating Is Hard



This article is part of our special report, “Reinventing Invention: Stories from Innovation’s Edge.”

Over the past 20 years, technological advances have enabled inventors to go from strength to strength. And yet, according to the legendary inventor Dean Kamen, innovation has stalled. Kamen made a name for himself with inventions including the first portable insulin pump for diabetics, an advanced wheelchair that can climb steps, and the Segway mobility device. Here, he talks about his plan for enabling innovators.

How has inventing changed since you started in the 1990s?

Dean Kamen: Kids all over the world can now be inventing in the world of synthetic biology the way we played with Tinkertoys and Erector Sets and Lego. I used to put pins and smelly formaldehyde in frogs in high school. Today in high school, kids will do experiments that would have won you the Nobel Prize in Medicine 40 years ago. But none of those kids are likely in any short time to be on the market with a pharmaceutical that will have global impact. Today, while invention is getting easier and easier, I think there are some aspects of innovation that have gotten much more difficult.

Can you explain the difference?

Kamen: Most people think those two words mean the same thing. Invention is coming up with an idea or a thing or a process that has never been done that way before. [Thanks to] more access to technology and 3D printers and simulation programs and virtual ways to make things, the threshold to be able to create something new and different has dramatically lowered.

Historically, inventions were only the starting point to get to innovation. And I’ll define an innovation as something that reached a scale where it impacted a piece of the world, or transformed it: the wheel, steam, electricity, the Internet. Getting an invention to the scale it needs to be to become an innovation has gotten easier—if it’s software. But if it’s sophisticated technology that requires mechanical or physical structure in a very competitive world? It’s getting harder and harder, due to competition and to global regulatory environments.

[For example,] in proteomics [the study of proteins] and genomics and biomedical engineering, the invention part is, believe it or not, getting a little easier because we know so much, because there are development platforms now to do it. But getting a biotech product cleared by the Food and Drug Administration is getting more expensive and time consuming, and the risks involved are making the investment community much more likely to invest in the next version of Angry Birds than curing cancer.

A lot of ink has been spilled about how AI is changing inventing. Why hasn’t that helped?

Kamen: AI is an incredibly valuable tool—as long as the value you’re looking for is to collect massive amounts of data and process that data effectively. That’s very different than what a lot of people believe, which is that AI is inventing and creating new and different ideas from whole cloth.

How are you using AI to help with innovation?

Kamen: Every medical school has incredibly brilliant professors and grad students with petri dishes. “Look, I can make nephrons. We can grow people a new kidney. They won’t need dialysis.” But they only have petri dishes full of the stuff. And the scale they need is hundreds and hundreds of liters.

I started a not-for-profit called ARMI—the Advanced Regenerative Manufacturing Institute—to help make it practical to manufacture human cells, tissues, and organs. We are using artificial intelligence to speed up our development processes and eliminate going down frustratingly long and expensive [dead-end] paths. We figure out how to bring tissue manufacturing to scale. We build the bioreactors, sensor technologies, robotics, and controls. We’re going to put them together and create an industry that can manufacture hundreds of thousands of replacement kidneys, livers, pancreases, lungs, blood, bone, you name it.

So ARMI’s purpose is to help would-be innovators?

Kamen: We are not going to make a product. We’re not even going to make a whole company. We’re going to create baseline core technologies that will enable all sorts of products and companies to emerge to create an entire new industry. It will be an innovation in health care that will lower costs because cures are much cheaper than chronic treatments. We have to break down the barriers so that these fantastic inventions can become global innovations.

This article appears in the November 2024 print issue as “The Inventor’s Inventor.”




ng

Gandhi Inspired a New Kind of Engineering



This article is part of our special report, “Reinventing Invention: Stories from Innovation’s Edge.”

The teachings of Mahatma Gandhi were arguably India’s greatest contribution to the 20th century. Raghunath Anant Mashelkar has borrowed some of that wisdom to devise a frugal new form of innovation he calls “Gandhian engineering.” Coming from humble beginnings, Mashelkar is driven to ensure that the benefits of science and technology are shared more equally. He sums up his philosophy with the epigram “more from less for more.” This engineer has led India’s preeminent R&D organization, the Council of Scientific and Industrial Research, and he has advised successive governments.

What was the inspiration for Gandhian engineering?

Raghunath Anant Mashelkar: There are two quotes of Gandhi’s that were influential. The first was, “The world has enough for everyone’s need, but not enough for everyone’s greed.” He was saying that when resources are exhaustible, you should get more from less. He also said the benefits of science must reach all, even the poor. If you put them together, it becomes “more from less for more.”

My own life experience inspired me, too. I was born to a very poor family, and my father died when I was six. My mother was illiterate and brought me to Mumbai in search of a job. Two meals a day was a challenge, and I walked barefoot until I was 12 and studied under streetlights. So it also came from my personal experience of suffering because of a lack of resources.

How does Gandhian engineering differ from existing models of innovation?

Mashelkar: Conventional engineering is market or curiosity driven, but Gandhian engineering is application and impact driven. We look at the end user and what we want to achieve for the betterment of humanity.

Most engineering is about getting more from more. Take an iPhone: They keep creating better models and charging higher prices. For the poor it is less from less: Conventional engineering looks at removing features as the only way to reduce costs.

In Gandhian engineering, the idea is not to create affordable [second-rate] products, but to make high technology work for the poor. So we reinvent the product from the ground up. While the standard approach aims for premium price and high margins, Gandhian engineering will always look at affordable price, but high volumes.

The Jaipur foot is a light, durable, and affordable prosthetic. Gurinder Osan/AP

What is your favorite example of Gandhian engineering?

Mashelkar: My favorite is the Jaipur foot. Normally, a sophisticated prosthetic foot costs a few thousand dollars, but the Jaipur foot does it for [US] $20. And it’s very good technology; there is a video of a person wearing a Jaipur foot climbing a tree, and you can see the flexibility is like a normal foot. Then he runs one kilometer in 4 minutes, 30 seconds.

What is required for Gandhian engineering to become more widespread?

Mashelkar: In our young people, we see innovation and we see passion, but compassion is the key. We also need more soft funding [grants or zero-interest loans], because venture capital companies often turn out to be “vulture capital” in a way, because they want immediate returns.

We need a shift in the mindset of businesses—they can make money not just from premium products for those at the top of the pyramid, but also products with affordable excellence designed for large numbers of people.

This article appears in the November 2024 print issue as “The Gandhi Inspired Inventor.”




ng

For this Stanford Engineer, Frugal Invention Is a Calling



Manu Prakash spoke with IEEE Spectrum shortly after returning to Stanford University from a month aboard a research vessel off the coast of California, where he was testing tools to monitor oceanic carbon sequestration. The associate professor conducts fieldwork around the world to better understand the problems he’s working on, as well as the communities that will be using his inventions.

This article is part of our special report, “Reinventing Invention: Stories from Innovation’s Edge.”

Prakash develops imaging instruments and diagnostic tools, often for use in global health and environmental sciences. His devices typically cost radically less than conventional equipment—he aims for reductions of two or more orders of magnitude. Whether he’s working on pocketable microscopes, mosquito or plankton monitors, or an autonomous malaria diagnostic platform, Prakash always includes cost and access as key aspects of his engineering. He calls this philosophy “frugal science.”

Why should we think about science frugally?

Manu Prakash: To me, when we are trying to ask and solve problems and puzzles, it becomes important: In whose hands are we putting these solutions? A frugal approach to solving the problem is the difference between 1 percent of the population or billions of people having access to that solution.

Lack of access creates these kinds of barriers in people’s minds, where they think they can or cannot approach a kind of problem. It’s important that we as scientists or just citizens of this world create an environment that feels that anybody has a chance to make important inventions and discoveries if they put their heart to it. The entrance to all that is dependent on tools, but those tools are just inaccessible.

How did you first encounter the idea of “frugal science”?

Prakash: I grew up in India and lived with very little access to things. And I got my Ph.D. at MIT. I was thinking about this stark difference in worlds that I had seen and lived in, so when I started my lab, it was almost a commitment to [asking]: What does it mean when we make access one of the critical dimensions of exploration? So, I think a lot of the work I do is primarily driven by curiosity, but access brings another layer of intellectual curiosity.

How do you identify a problem that might benefit from frugal science?

Prakash: Frankly, it’s hard to find a problem that would not benefit from access. The question to ask is “Where are the neglected problems that we as a society have failed to tackle?” We do a lot of work in diagnostics. A lot [of our solutions] beat the conventional methods that are neither cost effective nor any good. It’s not about cutting corners; it’s about deeply understanding the problem—better solutions at a fraction of the cost. It does require invention. For that order of magnitude change, you really have to start fresh.

Where does your involvement with an invention end?

Prakash: Inventions are part of our soul. Your involvement never ends. I just designed the 415th version of Foldscope [a low-cost “origami” microscope]. People only know it as version 3. We created Foldscope a long time ago; then I realized that nobody was going to provide access to it. So we went back and invented the manufacturing process for Foldscope to scale it. We made the first 100,000 Foldscopes in the lab, which led to millions of Foldscopes being deployed.

So it’s continuous. If people are scared of this, they should never invent anything [laughs], because once you invent something, it’s a lifelong project. You don’t put it aside; the project doesn’t put you aside. You can try to, but that’s not really possible if your heart is in it. You always see problems. Nothing is ever perfect. That can be ever consuming. It’s hard. I don’t want to minimize this process in any way or form.




ng

Scary Stories: Establishing a Field Amid Skepticism



In the spirit of the Halloween season, IEEE Spectrum presents a pair of stories that—although grounded in scientific truth rather than the macabre—were no less harrowing for those who lived them. In today’s installment, Robert Langer had to push back against his field’s conventional wisdom to pioneer a drug-delivery mechanism vital to modern medicine.

Nicknamed the Edison of Medicine, Robert Langer is one of the world’s most-cited researchers, with over 1,600 published papers, 1,400 patents, and a top-dog role as one of MIT’s nine prestigious Institute Professors. Langer pioneered the now-ubiquitous drug delivery systems used in modern cancer treatments and vaccines, indirectly saving countless lives throughout his 50-year career.

But, much like Edison and other inventors, Langer’s big ideas were initially met with skepticism from the scientific establishment.

He came up in the 1970s as a chemical engineering postdoc working in the lab of Dr. Judah Folkman, a pediatric surgeon at the Boston Children’s Hospital. Langer was tasked with solving what many believed was an impossible problem—isolating angiogenesis inhibitors to halt cancer growth. Folkman’s vision of stopping tumors from forming their own self-sustaining blood vessels was compelling enough, but few believed it possible.

Langer encountered both practical and social challenges before his first breakthrough. One day, a lab technician accidentally spilled six months’ worth of samples onto the floor, forcing him to repeat the painstaking process of dialyzing extracts. Those months of additional work steered Langer’s development of novel microspheres that could deliver large molecules of medicine directly to tumors.

In the 1970s, Langer developed these tiny microspheres to release large molecules through solid materials, a groundbreaking proof-of-concept for drug delivery. Robert Langer

Langer then submitted the discovery to prestigious journals and was invited to speak at a conference in Michigan in 1976. He practiced the 20-minute presentation for weeks, hoping for positive feedback from respected materials scientists. But when he stepped off the podium, a group approached him and said bluntly, “We don’t believe anything you just said.” They insisted that macromolecules were simply too large to pass through solid materials, and his choice of organic solvents would destroy many inputs. Conventional wisdom said so.

Nature published Langer’s paper three months later, demonstrating for the first time that non-inflammatory polymers could enable the sustained release of proteins and other macromolecules. The same year, Science published his isolation mechanism to restrict tumor growth.

Langer and Folkman’s research paved the way for modern drug delivery. MIT and Boston Children’s Hospital

Even with impressive publications, Langer still struggled to secure funding for his work in controlling macromolecule delivery, isolating the first angiogenesis inhibitors, and testing their behavior. His first two grant proposals were rejected on the same day, a devastating blow for a young academic. The reviewers doubted his experience as “just an engineer” who knew nothing about cancer or biology. One colleague tried to cheer him up, saying, “It’s probably good those grants were rejected early in your career. Since you’re not supporting any graduate students, you don’t have to let anyone go.” Langer thought the colleague was probably right, but the rejections still stung.

His patent applications, filed alongside Folkman at the Boston Children’s Hospital, were rejected five years in a row. After all, it’s difficult to prove you’ve got something good if you’re the only one doing it. Langer remembers feeling disappointed but not crushed entirely. Eventually, other scientists cited his findings and expanded upon them, giving Langer and Folkman the validation needed for intellectual property development. As of this writing, the pair’s two studies from 1976 have been cited nearly 2,000 times.

As the head of MIT’s Langer Lab, he often shares these same stories of rejection with early-career students and researchers. He leads a team of over 100 undergrads, grad students, postdoctoral fellows, and visiting scientists, all finding new ways to deliver genetically engineered proteins, DNA, and RNA, among other research areas. Langer’s reputation is further bolstered by the many successful companies he co-founded or advised, like mRNA leader Moderna, which rose to prominence after developing its widely used COVID-19 vaccine.

Langer sometimes thinks back to those early days—the shattered samples, the cold rejections, and the criticism from senior scientists. He maintains that “Conventional wisdom isn’t always correct, and it’s important to never give up—(almost) regardless of what others say.”




ng

What My Daughter’s Harrowing Alaska Airlines Flight Taught Me About Healthcare

As a leader who has committed much of his career to improving healthcare — an industry that holds millions of people’s lives in its hands — I took from this terrifying incident a new guiding principle. Healthcare needs to pursue a zero-failure rate.





ng

Pregnant and Empowered: Why Trust is the Latest Form of Member Engagement

Three ways health plans can engage, connect with, and delight their pregnant members to nurture goodwill, earn long-term trust, and foster loyal relationships that last.





ng

AI is Revolutionizing Healthcare, But Are We Ready for the Ethical Challenges? 

Navigating the regulatory and ethical requirements of different medical data providers across many different countries, as well as safeguarding patient privacy, is a mammoth task that requires extra resources and expertise.  





ng

Private Equity Is Picking Up Biologics CDMO Avid Bioservices in $1.1B Acquisition

CDMO Avid Bioservices is being acquired by the private equity firms GHO Capital Partners and Ampersand Capital Partners. Avid specializes in manufacturing biologic products for companies at all stages of development.





ng

CVS Health Exec: Payers Need to Stop Making Behavioral Health Providers Jump Through Hoops In Order to Participate in Value-Based Care

Value-based care contracting is especially difficult for behavioral health providers, Taft Parsons III, chief psychiatric officer at CVS Health/Aetna, pointed out during a conference this week.





ng

4 Things Employers Should Know About Psychedelic Medicines

During a panel discussion at the Behavioral Health Tech conference, experts shared the promise psychedelic medicines hold for mental health and why employers may want to consider offering them as a workplace benefit.





ng

FDA Takes Step Toward Removal of Ineffective Decongestants From the Market

The FDA has proposed removing oral phenylephrine from its guidelines for over-the-counter drugs due to inefficacy as a decongestant. Use of this ingredient in cold and allergy medicines grew after a federal law required that pseudoephedrine-containing products be kept behind pharmacy counters.





ng

Measuring Impact in Digital Youth Mental Health: What Investors Look For

Many companies are entering the digital youth mental health space, but it’s important to know which ones are effective, according to a panel of investors at the Behavioral Health Tech conference.
