Parents Should Not Post Children's Photos Online, Warn Safety Experts

Authored by Masooma Haq via The Epoch Times (emphasis ours),

With children spending an increasing amount of time on the internet and many uploading photos to their social media accounts, sexual predators continue to steal these images to produce child sexual abuse material (CSAM).

Further compounding the proliferation of CSAM is the easy access to artificial intelligence (AI), and law enforcement agencies and child protective organizations are seeing a dramatic rise in AI-generated CSAM.

Yaron Litwin is a digital safety expert and chief marketing officer at Netspark, the company behind a program called CaseScan that identifies AI-generated CSAM online, aiding law enforcement agencies in their investigations.

Mr. Litwin told The Epoch Times he recommends that parents and teens not post photos on any public forum and that parents talk to their children about the potential dangers of revealing personal information online.

“One of our recommendations is to be a little more cautious with images that are being posted online and really try to keep those within closed networks, where there are only people that you know,” Mr. Litwin said.

The American Academy of Child and Adolescent Psychiatry said in 2020 that on average, children ages 8 to 12 spend four to six hours a day watching or using screens, and teens spend up to 9 hours a day on their devices.

Parents Together, a nongovernmental organization that provides news about issues affecting families, released a report in 2023 (pdf) stating that "despite the bad and worsening risks of online sexual exploitation, 97% of children use social media and the internet every day, and 1 in 5 use it 'almost constantly.'"

One of Netspark’s safety tools is Canopy, an AI-powered tool that lets parents filter out harmful sexual digital content for their minor children while giving children the freedom to explore the internet, Mr. Litwin said.

Exploitative Content Expanding

The amount of CSAM online has gone up exponentially since generative AI became mainstream at the start of 2023, Mr. Litwin said. The problem is serious enough that all 50 states have asked Congress to institute a commission to study the problem of AI-generated CSAM, he said.

“There's definitely a correlation between the increase in AI-generated CSAM and when OpenAI and DALL-E and all these generative AI-type platforms launched,” Mr. Litwin said.

The FBI recently warned the public about the rise of AI-generated sexual abuse materials.

“Malicious actors use content manipulation technologies and services to exploit photos and videos—typically captured from an individual’s social media account, open internet, or requested from the victim—into sexually-themed images that appear true-to-life in likeness to a victim, then circulate them on social media, public forums, or pornographic websites,” said the FBI in a recent statement.

Mr. Litwin said that to truly protect children from online predators, parents and guardians must clearly discuss with their children the potential dangers of posting photos and talking to strangers online.

“So just really communicating with our kids about some of these risks and explaining to them that this might happen and making sure that they're aware,” he said.

An artificial intelligence (AI) logo blended with four fake Twitter accounts bearing profile pictures apparently generated by artificial intelligence software taken in Helsinki, Finland, on June 12, 2023. (Olivier Morin/AFP via Getty Images)

Dangers of Generative AI

Roo Powell, founder of Safe from Online Sex Abuse (SOSA), told The Epoch Times that because predators can use the image of a fully clothed child to create an explicit image with AI, it is best not to post any images of children online, even as toddlers.

“At SOSA, we encourage parents not to publicly share images or videos of their children in diapers or having a bath. Even though their genitals may technically be covered, perpetrators can save this content for their own gratification, or can use AI to make it explicit and then share that widely,” Ms. Powell said in an email.

While some people say AI-generated CSAM is not as harmful as images depicting the sexual abuse of real-life children, many believe it is worse.

AI-generated CSAM can be produced much more quickly than conventional images, inundating law enforcement with even more abuse referrals, and experts in the AI and online parental control space expect the problem to only get worse.

In other cases, AI-generated CSAM is created from a photo taken from a real child’s social media account and altered to be sexually explicit, endangering otherwise unvictimized children as well as their parents.

In worst-case scenarios, bad actors use images of real victims of child sexual abuse as a base to create computer-generated images. They can use the original photograph as an initial input, which is then altered according to prompts.

In some cases, the photo’s subject can be made to look younger or older.

In 2023, the Stanford Internet Observatory at Stanford University, in conjunction with Thorn, a nonprofit focused on technology that helps defend children from abuse, released a report titled "Generative ML and CSAM: Implications and Mitigations" (pdf), referring to generative machine learning.

“In just the first few months of 2023, a number of advancements have greatly increased end-user control over image results and their resultant realism, to the point that some images are only distinguishable from reality if the viewer is very familiar with photography, lighting and the characteristics of diffusion model outputs," the report states.

Studies have shown a link between viewing CSAM and sexually abusing children in real life.

In 2010, Canadian forensic psychologist Michael C. Seto and colleagues reviewed several studies and found that 50 to 60 percent of people who viewed CSAM admitted to abusing children themselves.

Even if some AI-generated CSAM images are not created with the images of real children, the images fuel the growth of the child exploitation market by normalizing CSAM and feeding the appetites of those who seek to victimize children.

Sextortion on the Rise

Because of how realistic it is, AI-generated CSAM is facilitating a rise in cases of sextortion.

Sextortion occurs when a predator pretends to be a young person to solicit semi- or fully-nude images from a victim and then extorts money or sexual acts from them under threat of making their images public.

In AI-generated cases, the criminal can alter the victim’s image to make it sexually explicit.

In one case, Mr. Litwin said a teenage weightlifting enthusiast posted a shirtless selfie that was then used by a criminal to create an AI-generated nude photo of him to extort money from the minor.

In other cases, the perpetrator might threaten to disclose the image, damaging the minor’s reputation. Faced with such a threat, many teens comply with the criminal’s demands, and some have taken their own lives rather than risk public humiliation.

The National Center for Missing and Exploited Children (NCMEC) operates the CyberTipLine, where citizens can report child sexual exploitation on the internet. In 2022, the tip line received over 32 million reports of CSAM. Although some of the reports are made multiple times about a single viral child sex abuse image, that is still an 82 percent increase from 2021, or close to 87,000 reports per day.

In December 2022, the FBI estimated there were 3,000 minor victims of sextortion.

“The FBI has seen a horrific increase in reports of financial sextortion schemes targeting minor boys—and the fact is that the many victims who are afraid to come forward are not even included in those numbers,” said FBI Director Christopher Wray in a 2022 statement.

The Parents Together report further states that “recent research shows 1 in 3 children can now expect to have an unwelcome sexual experience online before they turn 18.”

In addition, a 2022 report by Thorn (pdf) states that 1 in 6 children say they have shared explicit images of themselves online, and 1 in 4 children say the practice is normal.

An example of texts used by predators to entice children into compromising situations as featured in "Sextortion: The Hidden Pandemic." (Auris Media)

Using Good AI to Fight Bad AI

Prior to the wide availability of AI, editing and generating images required skill and knowledge of image-editing software. AI has made the process so quick and easy that even amateur users can generate lifelike images.

Netspark is leading the fight against AI-generated CSAM with CaseScan, its own AI-powered cyber safety tool, said Mr. Litwin.

“We definitely believe that to fight the bad AI, it's going to come through AI for good, and that's where we're focused,” he said.

Law enforcement agencies must go through massive amounts of images each day and are often unable to get through all of the CSAM reports in a timely manner, said Mr. Litwin, but this is exactly where CaseScan is able to assist investigators.

Unless police departments are using AI-centered solutions, they spend an extensive amount of time assessing whether the child in a photo is an AI-generated fake or an actual sexual abuse victim. Even before AI-generated content, law enforcement and child safety organizations were overwhelmed by the immense volume of CSAM reports.

Under U.S. law, AI-generated CSAM is treated the same as CSAM of real-life children, but Mr. Litwin said he does not know of any AI-generated CSAM case that has been prosecuted, so there is no precedent yet.

“I think today it's hard to take to court, it's hard to create robust cases. And my guess is that that's one reason that we're seeing so much of it,” he said.

To prosecute the producers of this online AI-generated CSAM, laws need to be updated to target the technologically advanced criminal activity committed by sexual predators.

Mr. Litwin said he believes predators will always find a way to circumvent technological limits set up by safety companies because AI is constantly advancing, but Netspark is also adapting to keep up with those that create AI-generated CSAM.

Mr. Litwin said CaseScan has enabled investigators to significantly reduce the amount of time it takes to identify AI-generated CSAM and to lighten the mental toll on investigators, who usually must view the images.

Tech Companies Must Do More

Ms. Powell said social media companies need to do more in the fight against CSAM.

“To effectively help protect kids and teens, Congress can mandate that social media platforms implement effective content moderation systems that identify cases of child abuse and exploitation and escalate those to law enforcement as needed,” she said.

“Congress can also require all social media platforms to create parental control features to help mitigate the risk of online predation. These can include the ability for a parent user to turn off all chat/messaging features, restrict who’s following their child, and manage screen time on the app,” she added.

In April, Sen. Dick Durbin (D-Ill.) introduced the STOP CSAM Act of 2023, which includes a provision that would change Section 230 of the Communications Decency Act and allow CSAM victims to sue social media platforms that host, store, or otherwise make this illegal content available.

“If [social media companies] don't put sufficient safety measures in place, they should be held legally accountable,” Mr. Durbin said during a Senate Judiciary subcommittee meeting in June.

Ms. Powell said she believes Congress has a responsibility to do more to keep children safe from abuse.

“Laws need to keep up with the constant evolution of technology,” she said, adding that law enforcement also needs tools to help them work faster.

Improving NCMEC

Reps. Ann Wagner (R-Mo.) and Sylvia Garcia (D-Texas) recently introduced the Child Online Safety Modernization Act of 2023 to fill the gaps in how CSAM is reported, ensuring criminals can be held accountable.

Currently, there are no requirements regarding what online platforms must include in a report to the NCMEC's CyberTipline, often leaving the organization and law enforcement without enough information to locate and rescue the child. In 2022, that amounted to about 50 percent of reports being untraceable.

In addition, the law does not mandate that online platforms report instances of child sex trafficking and enticement.

According to a 2022 report by Thorn, the majority of CyberTipline reports submitted by the tech industry contained such limited information that it was impossible for NCMEC to identify where the offense took place, and therefore the organization could not notify the appropriate law enforcement agency.

The Child Online Safety Modernization Act bolsters the NCMEC CyberTipline by 1) requiring reports from online platforms to include information to identify and locate the child depicted and the disseminator of the CSAM; 2) requiring online platforms to report instances of child sex trafficking and the sexual enticement of a child; and 3) allowing NCMEC to share technical identifiers associated with CSAM with nonprofits.

The bill also requires that the reports be preserved for an entire year, giving law enforcement the time they need to investigate the crimes.

Stefan Turkheimer, interim vice president for public policy at RAINN (Rape, Abuse & Incest National Network), said Ms. Wagner’s bill is crucial to helping law enforcement successfully investigate CSAM reports.

“The Child Online Safety Modernization Act is a step towards greater cooperation between law enforcement and internet service providers that will support the efforts to investigate, identify, and locate the children depicted in child sexual abuse materials,” Mr. Turkheimer said in a recent press statement.

Making this improvement is crucial to stopping the sexual exploitation of children, said Ms. Powell.

“In our collaborations with law enforcement, SOSA has seen a perpetrator go from the very first message to arriving at a minor’s house for sex in under two hours. Anyone with the propensity to harm children can do so quickly and easily from anywhere in the world just through internet access,” she said.

Tyler Durden Tue, 09/12/2023 - 19:25

Are Voters Recoiling Against Disorder?

Authored by Michael Barone via The Epoch Times (emphasis ours),

The headlines coming out of the Super Tuesday primaries have got it right. Barring cataclysmic changes, Donald Trump and Joe Biden will be the Republican and Democratic nominees for president in 2024.

(Left) President Joe Biden delivers remarks on canceling student debt at Culver City Julian Dixon Library in Culver City, Calif., on Feb. 21, 2024. (Right) Republican presidential candidate and former U.S. President Donald Trump stands on stage during a campaign event at Big League Dreams Las Vegas in Las Vegas, Nev., on Jan. 27, 2024. (Mario Tama/Getty Images; David Becker/Getty Images)

With Nikki Haley’s withdrawal, there will be no more significantly contested primaries or caucuses—the earliest both parties’ races have been over since something like the current primary-dominated system was put in place in 1972.

The primary results have spotlighted some of both nominees’ weaknesses.

Donald Trump lost high-income, highly educated constituencies, including the entire Washington, D.C., metro area—aka the Swamp. Many but by no means all Haley votes there were cast by Biden Democrats. Mr. Trump can’t afford to lose too many of the others in target states like Pennsylvania and Michigan.

Majorities and large minorities of voters in overwhelmingly Latino counties in Texas’s Rio Grande Valley and some in Houston voted against Joe Biden, and even more against Senate nominee Rep. Colin Allred (D-Texas).

Returns from Hispanic precincts in New Hampshire and Massachusetts show the same thing. Mr. Biden can’t afford to lose too many Latino votes in target states like Arizona and Georgia.

When Mr. Trump rode down that escalator in 2015, commentators assumed he’d repel Latinos. Instead, Latino voters nationally, and especially the closest eyewitnesses of Biden’s open-border policy, have been trending heavily Republican.

High-income liberal Democrats may sport lawn signs proclaiming, “In this house, we believe ... no human is illegal.” The logical consequence of that belief is an open border. But modest-income folks in border counties know that flows of illegal immigrants result in disorder, disease, and crime.

There is plenty of impatience with increased disorder in election returns below the presidential level. Consider Los Angeles County, America’s largest county, with nearly 10 million people, more people than 40 of the 50 states. It voted 71 percent for Mr. Biden in 2020.

Current returns show county District Attorney George Gascon winning only 21 percent of the vote in the nonpartisan primary. He’ll apparently face Republican Nathan Hochman, a critic of his liberal policies, in November.

Gascon, elected after the May 2020 death of counterfeit-passing suspect George Floyd in Minneapolis, is one of many county prosecutors supported by billionaire George Soros. His policies include not charging juveniles as adults, not seeking higher penalties for gang membership or use of firearms, and bringing fewer misdemeanor cases.

The predictable result has been increased car thefts, burglaries, and personal robberies. Some 120 assistant district attorneys have left the office, and there’s a backlog of 10,000 unprosecuted cases.

More than a dozen other Soros-backed and similarly liberal prosecutors have faced strong opposition or have left office.

St. Louis prosecutor Kim Gardner resigned last May amid lawsuits seeking her removal, Milwaukee’s John Chisholm retired in January, and Baltimore’s Marilyn Mosby was defeated in July 2022 and convicted of perjury in September 2023. Last November, Loudoun County, Virginia, voters (62 percent Biden) ousted liberal Buta Biberaj, who declined to prosecute a transgender student for assault, and in June 2022 voters in San Francisco (85 percent Biden) recalled famed radical Chesa Boudin.

Similarly, this Tuesday, voters in San Francisco passed ballot measures strengthening police powers and requiring treatment of drug-addicted welfare recipients.

In retrospect, it appears the Floyd video, appearing after three months of COVID-19 confinement, sparked a frenzied, even crazed reaction, especially among the highly educated and articulate. One fatal incident was seen as proof that America’s “systemic racism” was worse than ever and that police forces should be defunded and perhaps abolished.

2020 was “the year America went crazy,” I wrote in January 2021, a year in which police funding was actually cut by Democrats in New York, Los Angeles, San Francisco, Seattle, and Denver. A year in which young New York Times (NYT) staffers claimed they were endangered by the publication of Sen. Tom Cotton’s (R-Ark.) opinion article advocating calling in military forces if necessary to stop rioting, as had been done in Detroit in 1967 and Los Angeles in 1992. A craven NYT publisher even fired the editorial page editor for running the article.

Evidence of visible and tangible discontent with increasing violence and its consequences—barren and locked shelves in Manhattan chain drugstores, skyrocketing carjackings in Washington, D.C.—is as unmistakable in polls and election results as it is in daily life in large metropolitan areas. Maybe 2024 will turn out to be the year even liberal America stopped acting crazy.

Chaos and disorder work against incumbents, as they did in 1968 when Democrats saw their party’s popular vote fall from 61 percent to 43 percent.

Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times or ZeroHedge.

Tyler Durden Sat, 03/09/2024 - 23:20

Veterans Affairs Kept COVID-19 Vaccine Mandate In Place Without Evidence

Authored by Zachary Stieber via The Epoch Times (emphasis ours),

The U.S. Department of Veterans Affairs (VA) reviewed no data when deciding in 2023 to keep its COVID-19 vaccine mandate in place.

Doses of a COVID-19 vaccine in Washington in a file image. (Jacquelyn Martin/Pool/AFP via Getty Images)

VA Secretary Denis McDonough said on May 1, 2023, that the end of many other federal mandates “will not impact current policies at the Department of Veterans Affairs.”

He said the mandate was remaining for VA health care personnel “to ensure the safety of veterans and our colleagues.”

Mr. McDonough did not cite any studies or other data. A VA spokesperson declined to provide any data that was reviewed when deciding not to rescind the mandate. The Epoch Times submitted a Freedom of Information Act request for “all documents outlining which data was relied upon when establishing the mandate when deciding to keep the mandate in place.”

The agency searched for such data and did not find any.

“The VA does not even attempt to justify its policies with science, because it can’t,” Leslie Manookian, president and founder of the Health Freedom Defense Fund, told The Epoch Times.

“The VA just trusts that the process and cost of challenging its unfounded policies is so onerous, most people are dissuaded from even trying,” she added.

The VA’s mandate remains in place to this day.

The VA’s website claims that vaccines “help protect you from getting severe illness” and “offer good protection against most COVID-19 variants,” pointing in part to observational data from the U.S. Centers for Disease Control and Prevention (CDC) that estimate the vaccines provide poor protection against symptomatic infection and transient shielding against hospitalization.

There have also been increasing concerns among outside scientists about confirmed side effects like heart inflammation—the VA hid a safety signal it detected for the inflammation—and possible side effects such as tinnitus, which shift the benefit-risk calculus.

President Joe Biden imposed a slate of COVID-19 vaccine mandates in 2021. The VA was the first federal agency to implement a mandate.

President Biden rescinded the mandates in May 2023, citing a drop in COVID-19 cases and hospitalizations. His administration maintains the choice to require vaccines was the right one and saved lives.

“Our administration’s vaccination requirements helped ensure the safety of workers in critical workforces including those in the healthcare and education sectors, protecting themselves and the populations they serve, and strengthening their ability to provide services without disruptions to operations,” the White House said.

Some experts said requiring vaccination meant many younger people were forced to get a vaccine despite the risks potentially outweighing the benefits, leaving fewer doses for older adults.

“By mandating the vaccines to younger people and those with natural immunity from having had COVID, older people in the U.S. and other countries did not have access to them, and many people might have died because of that,” Martin Kulldorff, a professor of medicine on leave from Harvard Medical School, told The Epoch Times previously.

The VA was one of just a handful of agencies to keep its mandate in place following the removal of many federal mandates.

“At this time, the vaccine requirement will remain in effect for VA health care personnel, including VA psychologists, pharmacists, social workers, nursing assistants, physical therapists, respiratory therapists, peer specialists, medical support assistants, engineers, housekeepers, and other clinical, administrative, and infrastructure support employees,” Mr. McDonough wrote to VA employees at the time.

“This also includes VA volunteers and contractors. Effectively, this means that any Veterans Health Administration (VHA) employee, volunteer, or contractor who works in VHA facilities, visits VHA facilities, or provides direct care to those we serve will still be subject to the vaccine requirement at this time,” he said. “We continue to monitor and discuss this requirement, and we will provide more information about the vaccination requirements for VA health care employees soon. As always, we will process requests for vaccination exceptions in accordance with applicable laws, regulations, and policies.”

The version of the shots cleared in the fall of 2022, and available through the fall of 2023, did not have any clinical trial data supporting them.

A new version was approved in the fall of 2023 because there were indications that the shots offered only temporary protection and that the level of protection was lower than what was observed during earlier stages of the pandemic.

Ms. Manookian, whose group has challenged several of the federal mandates, said that the mandate “illustrates the dangers of the administrative state and how these federal agencies have become a law unto themselves.”

Tyler Durden Sat, 03/09/2024 - 22:10

Low Iron Levels In Blood Could Trigger Long COVID: Study

Authored by Amie Dahnke via The Epoch Times (emphasis ours),

People with inadequate iron levels in their blood due to a COVID-19 infection could be at greater risk of long COVID.


A new study indicates that problems with iron levels in the bloodstream likely trigger chronic inflammation and other conditions associated with the post-COVID phenomenon. The findings, published on March 1 in Nature Immunology, could offer new ways to treat or prevent the condition.

Long COVID Patients Have Low Iron Levels

Researchers at the University of Cambridge pinpointed low iron as a potential link to long-COVID symptoms thanks to a study they initiated shortly after the start of the pandemic. They recruited people who tested positive for the virus to provide blood samples for analysis over a year, which allowed the researchers to look for post-infection changes in the blood. The researchers looked at 214 samples and found that 45 percent of patients reported symptoms of long COVID that lasted between three and 10 months.

In analyzing the blood samples, the research team noticed that people experiencing long COVID had low iron levels, contributing to anemia and low red blood cell production, just two weeks after they were diagnosed with COVID-19. This was true for patients regardless of age, sex, or the initial severity of their infection.

According to one of the study co-authors, the removal of iron from the bloodstream is a natural process and defense mechanism of the body.

But it can jeopardize a person’s recovery.

“When the body has an infection, it responds by removing iron from the bloodstream. This protects us from potentially lethal bacteria that capture the iron in the bloodstream and grow rapidly. It’s an evolutionary response that redistributes iron in the body, and the blood plasma becomes an iron desert,” University of Oxford professor Hal Drakesmith said in a press release. “However, if this goes on for a long time, there is less iron for red blood cells, so oxygen is transported less efficiently, affecting metabolism and energy production, and for white blood cells, which need iron to work properly. The protective mechanism ends up becoming a problem.”

The research team believes that consistently low iron levels could explain why individuals with long COVID continue to experience fatigue and difficulty exercising. As such, the researchers suggested iron supplementation to help regulate and prevent the often debilitating symptoms associated with long COVID.

“It isn’t necessarily the case that individuals don’t have enough iron in their body, it’s just that it’s trapped in the wrong place,” Aimee Hanson, a postdoctoral researcher at the University of Cambridge who worked on the study, said in the press release. “What we need is a way to remobilize the iron and pull it back into the bloodstream, where it becomes more useful to the red blood cells.”

The research team pointed out that iron supplementation isn’t always straightforward. Achieving the right level of iron varies from person to person. Too much iron can cause stomach issues, ranging from constipation, nausea, and abdominal pain to gastritis and gastric lesions.

1 in 5 Still Affected by Long COVID

COVID-19 has affected nearly 40 percent of Americans, with one in five of those still suffering from symptoms of long COVID, according to the U.S. Centers for Disease Control and Prevention (CDC). Long COVID is marked by health issues that continue at least four weeks after an individual was initially diagnosed with COVID-19. Symptoms can last for days, weeks, months, or years and may include fatigue, cough or chest pain, headache, brain fog, depression or anxiety, digestive issues, and joint or muscle pain.

Tyler Durden Sat, 03/09/2024 - 12:50
