News
The FDA approved the first test to assess opioid addiction risk; Americans are using more cold and allergy medicines to mask any related symptoms to return to work and social gatherings, but may be overtreating themselves in the process; Rite Aid will be banned from using artificial intelligence–powered facial recognition technology for 5 years under a proposed Federal Trade Commission settlement.
The FDA announced Tuesday that it approved the first test to assess opioid addiction risk in certain individuals, according to Reuters. The test, named AvertD, is a prescription-only genetic laboratory test for patients 18 years and older who have not previously used oral opioid painkillers and are being considered for a 4- to 30-day prescription to treat acute pain, such as patients scheduled to undergo a planned surgical procedure. It involves swabbing a patient’s cheek to collect a DNA sample, which is then analyzed to determine whether it contains a combination of genetic variants that may be associated with an elevated risk of developing opioid addiction.
Americans are under pressure to show up for work and social gatherings now that the COVID-19 pandemic has receded, leading them to use cold and allergy medicines to mask their symptoms, with some overtreating themselves in the process, according to Bloomberg. According to research firm NIQ, US sales of upper respiratory OTC medications rose 23% this past year to $11.8 billion compared with sales in 2019, before the pandemic; cold and flu treatments grew even faster, gaining 30%. Brands at times encourage consumers to load up on OTC medications so they can carry on with their days. This may harm patients’ health, as physicians caution against taking too much of any medication, which increases the risk of adverse effects such as high blood pressure or fatigue.
The Federal Trade Commission (FTC) announced Tuesday that pharmacy retail chain Rite Aid will be banned from using artificial intelligence (AI)–powered facial recognition technology for 5 years under a proposed settlement, according to Axios. In a complaint, the FTC accused Rite Aid of violating a 2010 data security order by failing to implement reasonable procedures from 2012 to 2020 to prevent harm to consumers when using AI-based facial recognition; the FTC claimed Rite Aid used the technology to identify customers it suspected of problematic behavior, a practice that disproportionately targeted people of color and subjected them to embarrassment and harassment. Consequently, the FTC’s proposed order would require Rite Aid to implement comprehensive safeguards to prevent future harm.