
Government

Only 2% Of Americans Have Received New COVID Vaccine: CDC


Authored by Jack Phillips via The Epoch Times,


About 2 percent of all Americans have received the updated COVID-19 booster shot after it was authorized and recommended by federal health officials several weeks ago, according to updated data provided by the Department of Health and Human Services (HHS).

More than 7 million Americans have taken the updated shot, which is authorized for people aged 6 months and older, said an HHS spokesman. That's approximately 2 percent of all Americans.

“COVID-19 vaccine distribution, which has shifted to the private market, is a lot different than it was last year when the government was distributing them,” said an HHS spokesperson about the vaccination data. The spokesperson added that the agency is working “directly with manufacturers and distributors to ensure that the vaccines are getting to” various locations.

The statement added that 91 percent of Americans aged 12 years and older "can access the vaccine within 5 miles of where they live," adding that 14 million updated boosters for COVID-19 have been shipped to pharmacies and other locations. The vaccine was approved about a month ago by the Food and Drug Administration (FDA) before it was recommended by the U.S. Centers for Disease Control and Prevention (CDC) shortly thereafter.

It came as some people reported that it's difficult to find doses of the new vaccines at local pharmacies and doctors' offices. Jen Kates, a senior vice president at the Kaiser Family Foundation, said on X in September that her vaccine appointment was canceled due to a lack of supply.

The 7 million figure is up from Oct. 6, when Dr. Mandy Cohen, director of the CDC, told reporters that 4 million people had received the new vaccines.

HHS said Thursday that the latest vaccination rate is about the same as that of the initial bivalent booster shot when it was rolled out in 2022; demand for that 2022 booster was similarly low, according to CDC data.

Data has shown that about 17 percent of the U.S. population got that previous updated shot, or around 56.5 million people.

The updated vaccines were meant to target the COVID-19 XBB.1.5 variant, which was spreading across the United States when companies like Moderna and Pfizer came up with the new version.

Observational data for the bivalent vaccines, or the previous versions, found weak initial effectiveness that quickly waned.

CDC officials made unsupported claims during the briefing, part of a trend for the agency. “These vaccines will prevent severe disease for COVID-19,” Dr. Demetre Daskalakis, director of the CDC’s National Center for Immunization and Respiratory Disease, told reporters. There’s no evidence that’s true.

The only clinical study data for the new shots come from a study Moderna ran that involved injecting 50 people with the company's updated formulation. The result was a higher level of neutralizing antibodies. Officials believe antibodies protect against COVID-19.

Moderna did not provide any clinical efficacy estimates for infection, severe disease, or death. Pfizer said it was running a trial but has not reported any results.

Novavax's vaccine was authorized later without any new trial data, and CDC officials have said they recommend that shot as well. Unlike the Moderna and Pfizer vaccines, Novavax's doesn't use mRNA technology and is protein-based.

Hospitalizations Down

Despite the recent push for the latest vaccines, data provided by the CDC shows that hospitalizations for COVID-19 have been down for about three consecutive weeks.

For the week ending Sept. 30, the hospitalization rate is down by 6 percent, while emergency department visits are down by 14.5 percent and COVID-19 cases are down 1.2 percent, the figures show. Deaths are up 3.8 percent, although health officials have previously said that deaths generally lag behind hospitalizations and case numbers.

In July, COVID-19 hospitalizations had been increasing for several consecutive weeks. CDC historical data suggest that deaths have been relatively low compared with previous years.

But with the release of the Sept. 30 data, hospitalizations have dropped for multiple consecutive weeks.

Dr. Shira Doron, chief infection control officer for Tufts Medicine, told ABC News that the recent “upswing is not a surge; it’s not even a wave.”

The doctor added: “What we’re seeing is a very gradual and small upward trajectory of cases and hospitalizations, without deaths really going along, which is great news.”

The CDC on Oct. 6 released a report that attempted to push older Americans to get the newest vaccines by saying that COVID-19 is still a "public health threat," particularly for people aged 65 and older. The majority of hospitalizations, the report said, are occurring among that older demographic.

In the meantime, a handful of hospitals in California, New York state, Massachusetts, and New Jersey have re-implemented mandatory masking—at least for staff.

Several Northern California counties issued a mask mandate for all health care staff starting next month and ending in late April 2024. They include Contra Costa, Sonoma, Alameda, and San Mateo counties. Officials in the Southern California county of San Luis Obispo also issued a vaccine-or-mask mandate earlier this month.



International

Insilico Medicine presents data on AI-designed cancer drugs at 3 major cancer conferences


Credit: Insilico Medicine

Clinical stage artificial intelligence (AI) drug discovery company Insilico Medicine (“Insilico”) has been invited to present scientific data on its novel anti-cancer assets at three major upcoming cancer conferences — the European Society for Medical Oncology (ESMO) conference in Madrid Oct. 20-24, 2023; the Society of Immunotherapy of Cancer (SITC) conference Nov. 1-5, 2023 in San Diego; and the San Antonio Breast Cancer Symposium (SABCS) Dec. 5-9, 2023. 

Small molecule oncology target inhibitors represent the largest part of Insilico’s therapeutic pipeline portfolio, which includes 31 programs across 29 targets. The Company recently entered into a licensing deal with Exelixis for its potentially best-in-class generative AI-designed USP1 inhibitor for BRCA-mutant tumors, receiving $80 million upfront plus additional milestone payments and tiered royalties. That drug is currently in a Phase I clinical trial.

“Using our AI platform, we have been able to advance a number of anti-cancer therapeutics that use new mechanisms to stop tumor growth and cancer progression, including two in clinical stage,” says Sujata Rao, MD, Chief Medical Officer at Insilico Medicine. Dr. Rao has extensive experience in clinical oncology practice and over 15 years in pharma leading global clinical development for cancer drugs. “Driven by a strategy of focusing on novelty, confidence, and commercial tractability, and designed to meet the high unmet medical needs of patients, we have developed a number of promising anti-cancer assets and look forward to presenting to the leading cancer conferences.”

Dr. Rao showcased four of the Company’s novel AI-designed cancer inhibitors at the most recent American Association for Cancer Research (AACR) annual meeting.

Insilico’s upcoming cancer conference presentations include:

  • ESMO – Oct. 20-24: At ESMO, Insilico will present data on ISM8207, a novel QPCTL inhibitor for triple negative breast cancer and B-cell non-Hodgkin lymphoma. The small molecule inhibitor, currently being evaluated in a Phase I trial, has demonstrated anti-tumor activity in both hematological tumors and solid tumors in preclinical studies, as well as favorable pharmacokinetics and safety profiles. This molecule is available for licensing partners in the U.S., Europe and Japan. 
  • SITC – Nov. 1-5: At SITC, Insilico will present data on ISM5939, a novel, potent, orally available, selective ENPP1 inhibitor cancer immunotherapy for multiple tumor types that enhances the anti-tumor effects of immune checkpoint inhibitors in syngeneic murine cancer models. It is also being advanced as a possible treatment for Hypophosphatasia.
  • SABCS – Dec. 5-9: At SABCS, Insilico will present data on ISM5043, a novel, selective KAT6 inhibitor for the treatment of advanced ER+/HER2- breast cancer – the most common subtype of breast cancer. Current treatment for patients with advanced or metastatic disease is endocrine therapy in combination with CDK4/6 inhibitors but many patients develop resistance to therapy, indicating a huge unmet medical need. 

Insilico is advancing new therapeutics using generative AI via its proprietary end-to-end Pharma.AI platform for identifying novel targets (PandaOmics), designing new drugs (Chemistry42), and predicting the outcomes of clinical trials (InClinico). The platform has produced four drugs that have reached clinical trials, including a lead drug for the devastating chronic lung disease Idiopathic Pulmonary Fibrosis (IPF), the first AI-discovered and AI-designed drug to advance to Phase II trials with patients. 

“We’re really encouraged by the progress of our diverse pipeline of cancer therapeutics – two of which have progressed into clinical trials,” says Alex Zhavoronkov, PhD, founder and CEO of Insilico Medicine. “Our AI can be thought of as a ‘Google for targets’ that looks at every single small signal from massive datasets all over the world, including our own robotics data. It gives us signals that the target is working in a specific cancer, it already has demonstrated some efficacy, and it is going to be commercially tractable.” 

Dr. Rao and Insilico’s Chief Business Officer Michelle Chen, PhD, along with other business development professionals, will be in attendance at the upcoming conferences. For any interest in licensing or partnerships, please contact: bd@insilicomedicine.com.

 

About Insilico Medicine

Insilico Medicine, a global clinical stage biotechnology company powered by generative AI, is connecting biology, chemistry, and clinical trials analysis using next-generation AI systems. The company has developed AI platforms that utilize deep generative models, reinforcement learning, transformers, and other modern machine learning techniques for novel target discovery and the generation of novel molecular structures with desired properties. Insilico Medicine is developing breakthrough solutions to discover and develop innovative drugs for cancer, fibrosis, immunity, central nervous system diseases, infectious diseases, autoimmune diseases, and aging-related diseases. www.insilico.com



International

New polymer membranes, AI predictions could dramatically reduce energy, water use in oil refining


Credit: Candler Hobbs, Georgia Institute of Technology

A new kind of polymer membrane created by researchers at Georgia Tech could reshape how refineries process crude oil, dramatically reducing the energy and water required while extracting even more useful materials.

The so-called DUCKY polymers — more on the unusual name in a minute — are reported Oct. 16 in Nature Materials. And they’re just the beginning for the team of Georgia Tech chemists, chemical engineers, and materials scientists. They also have created artificial intelligence tools to predict the performance of these kinds of polymer membranes, which could accelerate development of new ones.

The implications are stark: the initial separation of crude oil components is responsible for roughly 1% of energy used across the globe. What’s more, the membrane separation technology the researchers are developing could have several uses, from biofuels and biodegradable plastics to pulp and paper products.

“We’re establishing concepts here that we can then use with different molecules or polymers, but we apply them to crude oil because that’s the most challenging target right now,” said M.G. Finn, professor and James A. Carlos Family Chair in the School of Chemistry and Biochemistry.

Crude oil in its raw state includes thousands of compounds that have to be processed and refined to produce useful materials — gas and other fuels, as well as plastics, textiles, food additives, medical products, and more. Squeezing out the valuable stuff involves dozens of steps, but it starts with distillation, a water- and energy-intensive process.

Researchers have been trying to develop membranes to do that work instead, filtering out the desirable molecules and skipping all the boiling and cooling.

“Crude oil is an enormously important feedstock for almost all aspects of life, and most people don’t think about how it’s processed,” said Ryan Lively, Thomas C. DeLoach Jr. Professor in the School of Chemical and Biomolecular Engineering. “These distillation systems are massive water consumers, and the membranes simply are not. They’re not using heat or combustion. They just use electricity. You could ostensibly run it off of a wind turbine, if you wanted. It’s just a fundamentally different way of doing a separation.”

What makes the team’s new membrane formula so powerful is a new family of polymers. The researchers used building blocks called spirocyclic monomers that assemble together in chains with lots of 90-degree turns, forming a kinky material that doesn’t compress easily and forms pores that selectively bind and permit desirable molecules to pass through. The polymers are not rigid, which means they’re easier to make in large quantities. They also have a well-controlled flexibility or mobility that allows pores of the right filtering structure to come and go over time.

The DUCKY polymers are created through a chemical reaction that’s easy to produce at a scale that would be useful for industrial purposes. It’s a flavor of a Nobel Prize-winning family of reactions called click chemistry, and that’s what gives the polymers their name. The reaction is called copper-catalyzed azide-alkyne cycloaddition — abbreviated CuAAC and pronounced “quack.” Thus: DUCKY polymers.

In isolation, the three key characteristics of the polymer membranes aren’t new; it’s their unique combination that makes them both novel and effective, Finn said.

The research team included scientists at ExxonMobil, who discovered just how effective the membranes could be. The company’s scientists took the crudest of the crude oil components — the sludge left at the bottom after the distillation process — and pushed it through one of the membranes. The process extracted even more valuable materials.

“That’s actually the business case for a lot of the people who process crude oils. They want to know what they can do that’s new. Can a membrane make something new that the distillation column can’t?” Lively said. “Of course, our secret motivation is to reduce energy, carbon, and water footprints, but if we can help them make new products at the same time, that’s a win-win.”

Predicting such outcomes is one way the team’s AI models can come into play. In a related study recently published in Nature Communications, Lively, Finn, and researchers in Rampi Ramprasad’s Georgia Tech lab described using machine learning algorithms and mass transport simulations to predict the performance of polymer membranes in complex separations.

“This entire pipeline, I think, is a significant development. And it’s also the first step toward actual materials design,” said Ramprasad, professor and Michael E. Tennenbaum Family Chair in the School of Materials Science and Engineering. “We call this a ‘forward problem,’ meaning you have a material and a mixture that goes in — what comes out? That’s a prediction problem. What we want to do eventually is to design new polymers that achieve a certain target permeation performance.”

Complex mixtures like crude oil might have hundreds or thousands of components, so accurately describing each compound in mathematical terms, how it interacts with the membrane, and extrapolating the outcome is “non-trivial,” as Ramprasad put it.

Training the algorithms also involved combing through all the experimental literature on solvent diffusion through polymers to build an enormous dataset. But, like the potential of membranes themselves to reshape refining, knowing ahead of time how a proposed polymer membrane might work would accelerate a materials design process that’s basically trial-and-error now, Ramprasad said.

“The default approach is to make the material and test it, and that takes time. This data-driven or machine learning-based approach uses past knowledge in a very efficient manner,” he said. “It’s a digital partner: You’re not guaranteed an exact prediction, because the model is limited by the space spanned by the data you use to train it. But it can extrapolate a little bit and it can take you in new directions, potentially. You can do an initial screening by searching through vast chemical spaces and make go, no-go decisions up front.”
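To make the "forward problem" concrete, here is a minimal sketch of that kind of workflow: train a regression model on polymer–solvent descriptor pairs, then use it to screen unseen candidates before anything is synthesized. The descriptors, the synthetic data, and the choice of a random-forest model are illustrative assumptions, not the Georgia Tech team's actual pipeline.

```python
# Minimal, hypothetical sketch of the "forward problem": given descriptors of a
# polymer and a solvent, predict a permeation-related quantity. Everything here
# (features, data, model choice) is a placeholder, not the published method.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic training table: each row pairs a polymer with a solvent.
# Columns stand in for: fractional free volume, chain rigidity,
# solvent molar volume, solvent polarity.
X = rng.random((500, 4))
# Stand-in target, e.g. log(permeability), loosely tied to the descriptors plus noise.
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.3 * rng.standard_normal(500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

predictions = model.predict(X_test)
print(f"mean absolute error on held-out pairs: {mean_absolute_error(y_test, predictions):.3f}")

# Screening step: rank unseen candidate polymer/solvent pairs by predicted
# permeation before deciding which are worth synthesizing and testing.
candidates = rng.random((10, 4))
ranked = sorted(zip(model.predict(candidates), range(10)), reverse=True)
print("top candidate index:", ranked[0][1])
```

In the published work, the models were checked against laboratory measurements (the 6% to 7% agreement reported below); a toy example like this only illustrates the train-then-screen loop, not the mass transport physics the real models incorporate.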

Lively said he’d long been a skeptic about the ability of machine learning tools to tackle the kinds of complex separations he works with.

“I always said, ‘I don’t think you can predict the complexity of transport through polymer membranes. The systems are too big; the physics are too complicated. Can’t do it.’”

But then he met Ramprasad: “Rather than just be a naysayer, Rampi and I took a stab at it with a couple of undergrads, built this big database, and dang. Actually, you can do it,” Lively said.

Developing the AI tools also involved comparing the algorithms’ predictions to actual results, including with the DUCKY polymer membranes. The experiments showed the AI models’ predictions were within 6% to 7% of actual measurements.

“It’s astonishing,” Finn said. “My career has been spent trying to predict what molecules are going to do. The machine learning approach, and Rampi’s execution of it, is just completely revolutionary.”

This research was supported by the U.S. Department of Energy, grant No. DE-EE0007888; the European Research Council, grant No. 758370; the Kwanjeong Educational Foundation; a Royal Society University Research Fellowship; and the ExxonMobil Technology and Engineering Company. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of any funding agency.



International

Study reveals areas of Brazilian Amazon where no ecological research has been done


Credit: Alexander Lees

Many parts of the Brazilian Amazon are neglected in ecological research, for several reasons, according to an article published in the journal Current Biology. Authored by Joice Ferreira of the Federal University of Pará (UFP) and colleagues from many countries who also belong to the Synergize Consortium, the article identifies the areas missing from ecological research and the factors that have determined these gaps, pinpointing opportunities for the planning of new investments in research in the region.

The researchers analyzed data from 7,694 ecological research sites to try to understand how logistics and human influence on the forests could explain the probability of research being done in different parts of the Amazon region. The period analyzed was 2010-20, and the survey covered nine groups of organisms: benthic invertebrates (living on the seabed or in the lowest layers of any water body), heteropterans (true bugs), odonates (dragonflies and damselflies), fish, macrophytes (aquatic plants), birds, woody vegetation, ants, and dung beetles.

“The consortium contacted people who had contributed to databases, standardized inventories and studies involving sampling efforts. Information was thereby compiled on three groups that represent Amazonian biodiversity: vertebrates, invertebrates, and plants in upland forests, flooded forests and aquatic environments – rivers, lakes, etc. This is the first paper published by the group,” said Mario Ribeiro de Moura, a researcher at the State University of Campinas’s Institute of Biology (IB-UNICAMP) in São Paulo, Brazil. He is a co-author of the article and a member of the consortium.

The findings showed that 15%-18% of the most neglected areas in the Brazilian Amazon are highly susceptible to climate change by 2050. The least studied areas are also among the most threatened, lying in the vicinity of the “deforestation arc”, a swathe of territory extending along the southern, southeastern and eastern borders of Amazonia, mostly in the states of Acre, Amazonas, Maranhão, Mato Grosso, Pará, Rondônia and Tocantins.

The main gaps in Amazonian ecological research were in upland areas. “This was expected and probably reflects the role played by navigable waterways in facilitating access to blackwater and whitewater inundation forest, as well as other aquatic environments,” Moura said.

Not by chance, the least pessimistic scenarios appeared along rivers in northeast Pará and Roraima, southeastern Acre and northern Rondônia. “In these areas, the future impact of climate change will be less severe, and we have more knowledge of the species that live there,” Moura said.

The study was supported by FAPESP via two postdoctoral fellowships in Brazil. One of the fellowships was awarded to Raquel de Carvalho, and the other to Angélica Faria de Resende. Moura was supported by a Young Investigator Grant and a scholarship in Brazil.

Research biases

The scientists mapped the most neglected areas of the Amazon region in terms of ecological research and superimposed on this map the areas most likely to be affected by climate change, based on a metric they developed to reflect its intensity. Deforestation and degradation data were taken from a recent study published in Science on the drivers of deforestation in the Amazon. The correlations between datasets showed that ecological research in the Amazon is more frequent in already deforested areas than in areas where deforestation is predicted over the next three decades.

“Environmental change is happening at a very fast pace, including climate change and landscape transformation. To understand how these changes affect biodiversity, we need to know what was in a given area before they happened. The Amazon is one of the last significantly conserved refuges of tropical biodiversity and essential to an understanding of the isolated effect of climate change and habitat destruction on biodiversity,” Moura said. “The study highlighted the areas at risk of environmental change in the coming years and not yet explored by scientists. Without sufficient ecological research, we won’t be able to know what’s changing and what’s being lost.”

With regard to logistics, accessibility and distance to research facilities were key predictors of the probability of research being done. “Access is a mixed blessing, as evidenced by the deforestation arc. Easy access enables researchers to reach more areas, so part of this immense arc has been thoroughly studied, but it also enables those responsible for deforestation and other malefactors to reach these areas. Little information is available on the threatened areas at the edges of the deforestation arc,” Moura said.

Access, and hence research probability, increased with proximity to transportation and research facilities for all upland organisms and most representatives of wetlands and aquatic habitats. “The length of the dry season determines ease of access by water. In flooded forest areas, the shorter the dry season, the easier it is to gain access by river, and this increases the likelihood of research. In upland areas, more severe dry seasons facilitate overland access, with less mud and inundation,” Moura said.

Forest degradation and land tenure were also moderately important, though consistent, predictors across all organism groups. Both factors affected ecological research in the same direction, with research probability slightly declining in more degraded areas and Indigenous territories, but increasing in conservation units.

In short, less research is done in degraded areas and Indigenous territories, and more in conservation units. “It’s harder to obtain access to Indigenous communities, or there may be a lack of administrative mechanisms that connect researchers with the bodies that regulate such access and with the communities themselves. We need to improve integration between the parties involved, and above all engage local communities in the knowledge creation process. Far more research goes on in conservation units than Indigenous territories, although both are types of protected area,” Moura said.

In Carvalho’s opinion, this is a distribution problem, since Indigenous territories account for some 23% of the total area of the Brazilian Amazon. “At the same time, several Indigenous territories are the best conserved parts of the Amazon biome. It would be very valuable if we could do research there,” she said.
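As a rough illustration of the kind of analysis described in this section, the sketch below fits a logistic model of whether a grid cell has been surveyed, using the logistical predictors named in the article (distance to research facilities and transport, dry-season length, forest degradation, and land tenure). The data, column names, and effect sizes are invented placeholders, not the consortium's dataset or published results.

```python
# Rough, hypothetical sketch of modeling research probability across Amazon grid
# cells from logistical predictors named in the article. Data, column names, and
# effect sizes are invented for illustration only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 1_000

cells = pd.DataFrame({
    "dist_to_research_facility_km": rng.uniform(0, 800, n),
    "dist_to_transport_km": rng.uniform(0, 200, n),    # rivers or roads
    "dry_season_months": rng.uniform(1, 6, n),
    "forest_degradation_index": rng.uniform(0, 1, n),
    "indigenous_territory": rng.integers(0, 2, n),     # 1 = inside an Indigenous territory
})

# Hypothetical "ground truth": surveys are more likely near facilities and
# transport, and slightly less likely inside Indigenous territories.
logit = (2.0
         - 0.004 * cells["dist_to_research_facility_km"]
         - 0.010 * cells["dist_to_transport_km"]
         - 0.500 * cells["indigenous_territory"])
cells["surveyed"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000)
model.fit(cells.drop(columns="surveyed"), cells["surveyed"])

# Coefficient signs show how each predictor shifts the odds that a cell has been studied.
for name, coef in zip(cells.columns.drop("surveyed"), model.coef_[0]):
    print(f"{name:>30s}: {coef:+.4f}")
```

The signs and relative sizes of such fitted coefficients are the kind of evidence used to argue that accessibility, degradation, and land tenure shape where ecological research actually happens.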

Novel strategies

According to Moura, the Amazon Rainforest is under-represented in global databases used as a source for research on biodiversity. “As noted in the article, we need to integrate the information we have about the Amazon with global databases. The Synergize Consortium has projects that could contribute to global assessments. The information reviewed for this study mostly complies with the requirements of other databases and could be used to improve the representativeness of Amazonian biodiversity in future research on global change. The consortium plans to use this study as a basis for establishing itself as an important collaborative network for other research groups interested in analyzing environmental changes in the Amazon,” he said.

The Synergize Consortium’s principal investigators are Ferreira, who is affiliated with EMBRAPA Amazônia Oriental, a unit of the Brazilian Agricultural Research Corporation (EMBRAPA); and Filipe França, a researcher at the University of Bristol in the United Kingdom. Jos Barlow, a professor at the University of Lancaster, also in the UK, is a co-author of the article and a member of the consortium’s steering committee.

Moura believes the group’s findings can be used to develop novel funding strategies for the Amazon. “Once you’ve identified the gaps, you can target them for investment in conservation and research, or give more weight to research in these areas in future calls for proposals. Public policy and action plans can take these results into consideration, especially as far as biodiversity monitoring and inventorying are concerned,” he said.

About São Paulo Research Foundation (FAPESP)

The São Paulo Research Foundation (FAPESP) is a public institution with the mission of supporting scientific research in all fields of knowledge by awarding scholarships, fellowships and grants to investigators linked with higher education and research institutions in the State of São Paulo, Brazil. FAPESP is aware that the very best research can only be done by working with the best researchers internationally. Therefore, it has established partnerships with funding agencies, higher education, private companies, and research organizations in other countries known for the quality of their research and has been encouraging scientists funded by its grants to further develop their international collaboration. You can learn more about FAPESP at www.fapesp.br/en and visit FAPESP news agency at www.agencia.fapesp.br/en to keep updated with the latest scientific breakthroughs FAPESP helps achieve through its many programs, awards and research centers. You may also subscribe to FAPESP news agency at http://agencia.fapesp.br/subscribe.

