Computer

L.A. County Sues Pepsi and Coca-Cola Over Their Role in the Plastic Pollution Crisis

Slashdot - Mon, 2024-11-04 13:34
An anonymous reader shared this report from the Los Angeles Times: Los Angeles County has filed suit against the world's largest beverage companies — Coca-Cola and Pepsi — claiming the soda and drink makers lied to the public about the effectiveness of plastic recycling and, as a result, left county residents and ecosystems choking in discarded plastic... The Los Angeles County suit alleges — in a vein similar to that of [California attorney general] Bonta's suit against Exxon Mobil — that the global beverage companies misrepresented the environmental impact of their plastic bottles, "despite knowing that plastics cannot be readily disposed of without associated environmental impacts." "Coke and Pepsi need to stop the deception and take responsibility for the plastic pollution problems" their products are causing, said Los Angeles County Board of Supervisors Chair Lindsey P. Horvath... Currently, just 9% of the world's plastics are recycled. The rest ends up being incinerated, sent to landfills, or discarded on the landscape, where it is often flushed into rivers or out to sea. At the same time, there is growing concern about the health and environmental consequences of microplastics — the bits of degraded plastic that slough off as the product ages, or is used, or washed. The tiny particles have been detected in every ecosystem on the planet that has been surveyed, as well as in nearly every living organism examined... According to the county's statement, the two companies have consistently ranked as the world's "top plastic polluters...." The beverage maker lawsuit was filed in Los Angeles Superior Court by County Counsel Dawyn R. Harrison on behalf of the people of the state of California... "The goal of this lawsuit is to stop the unfair and illegal conduct, to address the marketing practices that deceive consumers, and to force these businesses to change their practices to reduce the plastic pollution problem in the County and in California," Harrison said in a statement. "My office is committed to protecting the public from deceptive business practices and holding these companies accountable for their role in the plastic pollution crisis."

Read more of this story at Slashdot.

Categories: Computer, News

What Happened After Remote Workers Were Offered $10,000 to Move to Tulsa?

Slashdot - Mon, 2024-11-04 09:34
Five years ago remote workers were offered $10,000 to move to Tulsa, Oklahoma for at least a year. Since then roughly 3,300 have accepted the offer, according to the New York Times. But more importantly, researchers are now looking at the results: Their research, released this month, surveyed 1,248 people — including 411 who had participated in Tulsa Remote and others who were accepted but didn't move or weren't accepted but had applied to the program — and found that remote workers who moved to Tulsa saved an average of $25,000 more annually on housing costs than the group that was chosen but didn't move... Nearly three-quarters of participants who have completed the program are still living in Tulsa. The program brings them together for farm-to-table dinners, movie nights and local celebrity lectures to help build community, given that none have offices to commute to. The article says every year the remote workers contribute $14.9 million in state income taxes and $5.8 million in sales taxes (more than offsetting the $33 million spent over the last five years). And additional benefits could be even greater. "We know that for every dollar we've spent on the incentive, there's been about a $13 return on that investment to the city," the program's managing director told Fortune — pointing out that the remote workers have an average salary of $100,000. (500 of the 3,300 even bought homes...) The Tulsa-based George Kaiser Family Foundation — which provides the $10,000 awards — told the New York Times it will continue funding the program "so long as it demonstrates to be a community-enhancing opportunity." And with so much of the population now able to work remotely, the lead author on the latest study adds that "Every heartland mayor should pay attention to this..."

Read more of this story at Slashdot.

Categories: Computer, News

CodeSOD: A Matter of Understanding

The Daily WTF - Mon, 2024-11-04 07:30

For years, Victoria had a co-worker who "programmed by Google Search"; they didn't understand how anything worked, so they simply plugged their problem into Google search and then copy/pasted and edited until they got code that worked. For this developer, I'm sure ChatGPT has been a godsend, but this code predates its wide use. It's pure "Googlesauce".

StringBuffer stringBuffer = new StringBuffer();
stringBuffer.append("SELECT * FROM TABLE1 WHERE COLUMN1 = 1 WITH UR");
String sqlStr = stringBuffer.toString();
ps = getConnection().prepareStatement(sqlStr);
ps.setInt(1, code);
rs = ps.executeQuery();
while (rs.next()) {
    count++;
}

The core of this WTF isn't anything special: instead of running a SELECT COUNT, they run a SELECT and then loop over the results to get the count. But it's all the little details in here which make it fun.

They start by using a StringBuffer to construct their query: not a horrible plan when the query is long, but this is just a single, simple, one-line query. The query contains a WITH clause, but it's in the wrong spot. Then they prepareStatement it, which does nothing, since this query doesn't contain any parameters (and also isn't syntactically valid). Once it's prepared, they set the non-existent parameter 1 to a value; this operation will throw an exception because there are no parameters in the query.

Finally, they loop across the results to count.
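For contrast, here is a minimal sketch of the conventional version, assuming the DB2-style dialect the WITH UR hints at and the same surrounding context the fragment relies on (java.sql imports and the original's getConnection() helper): use a real parameter marker, and let the database do the counting.

int countRows(int code) throws SQLException {
    // one parameter marker, so prepareStatement() actually earns its keep
    String sqlStr = "SELECT COUNT(*) FROM TABLE1 WHERE COLUMN1 = ? WITH UR";
    try (PreparedStatement ps = getConnection().prepareStatement(sqlStr)) {
        ps.setInt(1, code); // now there really is a parameter 1 to set
        try (ResultSet rs = ps.executeQuery()) {
            return rs.next() ? rs.getInt(1) : 0; // the database does the counting
        }
    }
}

Counting in the database also avoids shipping every matching row across the wire just to throw it away.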

The real WTF is that this code ended up in the code base, somehow. The developer said, "Yes, this seems good, I'll check in this non-functional blob that I definitely don't understand," and then there were no protections in place to keep that from happening. Now it falls to more competent developers, like Victoria, to clean up after this co-worker.

Categories: Computer

Python Overtakes JavaScript on GitHub, Annual Survey Finds

Slashdot - Mon, 2024-11-04 06:34
GitHub released its annual "State of the Octoverse" report this week. And while "Systems programming languages, like Rust, are also on the rise... Python, JavaScript, TypeScript, and Java remain the most widely used languages on GitHub." In fact, "In 2024, Python overtook JavaScript as the most popular language on GitHub." They also report that usage of Jupyter Notebooks "skyrocketed" with a 92% jump, which along with Python's rise seems to underscore "the surge in data science and machine learning on GitHub..." We're also seeing increased interest in AI agents and smaller models that require less computational power, reflecting a shift across the industry as more people focus on new use cases for AI... While the United States leads in contributions to generative AI projects on GitHub, we see more absolute activity outside the United States. In 2024, there was a 59% surge in the number of contributions to generative AI projects on GitHub and a 98% increase in the number of projects overall — and many of those contributions came from places like India, Germany, Japan, and Singapore... Notable growth is occurring in India, which is expected to have the world's largest developer population on GitHub by 2028, as well as across Africa and Latin America... [W]e have seen greater growth outside the United States every year since 2013 — and that trend has sped up over the past few years. Last year they'd projected India would have the most developers on GitHub by 2027, but now believe it will happen a year later. This year's top 10?

1. United States
2. India
3. China
4. Brazil
5. United Kingdom
6. Russia
7. Germany
8. Indonesia
9. Japan
10. Canada

(Interestingly, the UK's population ranks #21 among countries of the world, while Germany ranks #19, and Canada ranks #36.) GitHub's announcement argues the rise of non-English, high-population regions "is notable given that it is happening at the same time as the proliferation of generative AI tools, which are increasingly enabling developers to engage with code in their natural language." And they offer one more data point: GitHub's For Good First Issue is a curated list of Digital Public Goods that need contributors, connecting those projects with people who want to address a societal challenge and promote sustainable development... Significantly, 34% of contributors to the top 10 For Good First Issue projects... made their first contribution after signing up for GitHub Copilot. There are now 518 million projects on GitHub — with year-over-year growth of 25%...

Read more of this story at Slashdot.

Categories: Computer, News

Will Charging Cables Ever Have a Single Standardized Port?

Slashdot - Mon, 2024-11-04 03:38
The Atlantic complains that our chaos of different plug types "was supposed to end, with USB-C as our savior." But part of the problem is what they call "the second circle of our cable hell: My USB-C may not be the same as yours. And the USB-C you bought two years ago may not be the same as the one you got today. And that means it might not do what you now assume it can." A lack of standardization is not the problem here. The industry has designed, named, and rolled out a parade of standards that pertain to USB and all its cousins. Some of those standards live inside other standards. For example, USB 3.2 Gen 1 is also known as USB 3.0, even though it's numbered 3.2. (What? Yes.) And both of these might be applied to cables with USB-A connectors, or USB-B, or USB-Micro B, or — why not? — USB-C. The variations stretch on and on toward the horizon. Hope persists that someday, eventually, this hell can be escaped — and that, given sufficient standardization, regulatory intervention, and consumer demand, a winner will emerge in the battle of the plugs. But the dream of having a universal cable is always and forever doomed, because cables, like humankind itself, are subject to the curse of time, the most brutal standard of them all. At any given moment, people use devices they bought last week alongside those they've owned for years; they use the old plugs in rental cars or airport-gate-lounge seats; they buy new gadgets with even better capabilities that demand new and different (if similar-looking) cables. Even if Apple puts a USB-C port in every new device, and so does every other manufacturer, that doesn't mean that they will do everything you will expect cables to do in the future. Inevitably, you will find yourself needing new ones. Back in 1998, the New York Times told me, "If you make your move to U.S.B. now, you can be sure that your new devices will have a port to plug into." I was ready! I'm still ready. But alas, a port to plug into has never been enough. Obligatory XKCD.

Read more of this story at Slashdot.

Categories: Computer, News

Researchers Develop New Method That Tricks Cancer Cells Into Killing Themselves

Slashdot - Mon, 2024-11-04 01:39
Our bodies divest themselves of 60 billion cells every day through a natural process called "apoptosis". So Stanford medicine researchers are developing a new approach to cancer therapy that could "trick cancer cells into disposing of themselves," according to an announcement from Stanford's medical school: Their method accomplishes this by artificially bringing together two proteins in such a way that the new compound switches on a set of cell death genes... One of these proteins, BCL6, when mutated, drives the blood cancer known as diffuse large cell B-cell lymphoma... [It] sits on DNA near apoptosis-promoting genes and keeps them switched off, helping the cancer cells retain their signature immortality. The researchers developed a molecule that tethers BCL6 to a protein known as CDK9, which acts as an enzyme that catalyzes gene activation, in this case, switching on the set of apoptosis genes that BCL6 normally keeps off. "The idea is, Can you turn a cancer dependency into a cancer-killing signal?" asked Nathanael Gray, PhD, co-senior author with Crabtree, the Krishnan-Shah Family Professor and a chemical and systems biology professor. "You take something that the cancer is addicted to for its survival and you flip the script and make that be the very thing that kills it...." When the team tested the molecule in diffuse large cell B-cell lymphoma cells in the lab, they found that it indeed killed the cancer cells with high potency. They also tested the molecule in healthy mice and found no obvious toxic side effects, even though the molecule killed off a specific category of the animals' healthy B cells, a kind of immune cell, which also depend on BCL6. They're now testing the compound in mice with diffuse large B-cell lymphoma to gauge its ability to kill cancer in a living animal. Because the technique relies on the cells' natural supply of BCL6 and CDK9 proteins, it seems to be very specific for the lymphoma cells — the BCL6 protein is found only in this kind of lymphoma cell and in one specific kind of B cell. The researchers tested the molecule in 859 different kinds of cancer cells in the lab; the chimeric compound killed only diffuse large cell B-cell lymphoma cells. Scientists have been trying to shut down cancer-driving proteins, one of the researchers says, but instead, "we're trying to use them to turn signaling on that, we hope, will prove beneficial for treatment." The two researchers have co-founded the biotech startup Shenandoah Therapeutics, which "aims to further test this molecule and a similar, previously developed molecule," according to the article, "in hopes of gathering enough pre-clinical data to support launching clinical trials of the compounds." They also plan to build similar molecules that could target other cancer-driving proteins...

Read more of this story at Slashdot.

Categories: Computer, News

How a Slice of Cheese Almost Derailed Europe's Most Important Rocket Test

Slashdot - Mon, 2024-11-04 00:39
Long-time Slashdot reader schwit1 shared this report from the blog Interesting Engineering: A team of students made history this month by performing Europe's first rocket hop test. Those who have followed SpaceX's trajectory will know hop tests are a vital stepping stone for a reusable rocket program, as they allow engineers to test their rocket's landing capabilities. Impressively, no private company or space agency in Europe had ever performed a rocket hop test before. Essentially, a group of students performed one of the most important rocket tests in the history of European rocketry. However, the remarkable nature of this story doesn't end there. Amazingly, the whole thing was almost derailed by a piece of cheese. A slice of Gruyère the team strapped to their rocket's landing legs almost caused the rocket to spin out of control. Thankfully, disaster was averted, and the historic hopper didn't end up as rocket de-Brie.

Read more of this story at Slashdot.

Categories: Computer, News

Leaked Training Shows Doctors In New York's Biggest Hospital System Using AI

Slashdot - Sun, 2024-11-03 23:39
Slashdot reader samleecole shared this report from 404 Media: Northwell Health, New York State's largest healthcare provider, recently launched a large language model tool that it is encouraging doctors and clinicians to use for translation and for handling sensitive patient data, and has suggested it can be used for diagnostic purposes, 404 Media has learned. Northwell Health has more than 85,000 employees. An internal presentation and employee chats obtained by 404 Media show how healthcare professionals are using LLMs and chatbots to edit writing, make hiring decisions, do administrative tasks, and handle patient data. In the presentation given in August, Rebecca Kaul, senior vice president and chief of digital innovation and transformation at Northwell, along with a senior engineer, discussed the launch of the tool, called AI Hub, and gave a demonstration of how clinicians and researchers—or anyone with a Northwell email address—can use it... AI Hub can be used for "clinical or clinical adjacent" tasks, as well as answering questions about hospital policies and billing, writing job descriptions and editing writing, and summarizing electronic medical record excerpts and inputting patients' personally identifying and protected health information. The demonstration also showed potential capabilities that included "detect pancreas cancer" and "parse HL7," a health data standard used to share electronic health records. The leaked presentation shows that hospitals are increasingly using AI and LLMs to streamline administrative tasks, and that some are experimenting with or at least considering how LLMs would be used in clinical settings or in interactions with patients.
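For readers unfamiliar with it, HL7 v2.x is a pipe-delimited text format, and "parsing HL7" mostly means splitting a message into segments, fields, and components. Here's a minimal illustrative sketch; the toy message, names, and the class itself are invented for illustration and are not from Northwell's AI Hub:

public class Hl7Sketch {
    public static void main(String[] args) {
        // Toy HL7 v2.x message: segments are separated by carriage returns,
        // fields by '|', components by '^'. All contents are made up.
        String msg = "MSH|^~\\&|SENDER|FAC|RECEIVER|FAC|202411031200||ADT^A01|MSG001|P|2.5\r"
                   + "PID|1||12345^^^HOSP||DOE^JANE||19800101|F";
        for (String segment : msg.split("\r")) {
            String[] fields = segment.split("\\|");
            if (fields[0].equals("PID")) {
                // PID-5 holds the patient name: family^given
                String[] name = fields[5].split("\\^");
                System.out.println("Patient: " + name[1] + " " + name[0]);
            }
        }
    }
}

Real-world HL7 adds escape sequences, repeating fields, and site-specific quirks, which is why production systems usually use a dedicated parsing library rather than string splitting.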

Read more of this story at Slashdot.

Categories: Computer, News

New Study Suggests Oceans Absorb More CO2 Than Previously Thought

Slashdot - Sun, 2024-11-03 22:39
Long-time Slashdot reader schwit1 shared this story from SciTechDaily: New research confirms that subtle temperature differences at the ocean surface, known as the "ocean skin," increase carbon dioxide absorption. This discovery, based on precise measurements, suggests global oceans absorb 7% more CO2 than previously thought, aiding climate understanding and carbon assessments... Until now, global estimates of air-sea CO2 fluxes have typically ignored the importance of temperature differences in the near-surface layer... Dr Gavin Tilstone, from Plymouth Marine Laboratory (PML), said: "This discovery highlights the intricacy of the ocean's water column structure and how it can influence CO2 draw-down from the atmosphere. Understanding these subtle mechanisms is crucial as we continue to refine our climate models and predictions. It underscores the ocean's vital role in regulating the planet's carbon cycle and climate."

Read more of this story at Slashdot.

Categories: Computer, News

After Silence, NASA's Voyager Finally Phones Home - With a Device Unused Since 1981

Slashdot - Sun, 2024-11-03 21:39
Somewhere off in interstellar space, 15.4 billion miles away from Earth, NASA's 47-year-old Voyager 1 "recently went quiet," reports Mashable. The probe "shut off its main radio transmitter for communicating with mission control..." Voyager's problem began on October 16, when flight controllers sent the robotic explorer a somewhat routine command to turn on a heater. Two days later, when NASA expected to receive a response from the spacecraft, the team learned something tripped Voyager's fault protection system, which turned off its X-band transmitter. By October 19, communication had altogether stopped. The flight team was not optimistic. However, Voyager 1 was equipped with a backup that relies on a different, albeit significantly fainter, frequency. No one knew if the second radio transmitter could still work, given the aging spacecraft's extreme distance. Days later, engineers with the Deep Space Network, a system of three enormous radio dish arrays on Earth, found the signal whispering back over the S-band transmitter. The device hadn't been used since 1981, according to NASA. "The team is now working to gather information that will help them figure out what happened and return Voyager 1 to normal operations," NASA said in a recent mission update. It's been more than 12 years since Voyager 1 entered interstellar space, the article points out. And interstellar space "is a high-radiation environment that nothing human-made has ever flown in before. That means the only thing the teams running the old probes can count on are surprises."

Read more of this story at Slashdot.

Categories: Computer, News

Millions of U.S. Cellphones Could Be Vulnerable to Chinese Government Surveillance

Slashdot - Sun, 2024-11-03 20:27
Millions of U.S. cellphone users could be vulnerable to Chinese government surveillance, warns a Washington Post columnist, "on the networks of at least three major U.S. carriers." They cite six current or former senior U.S. officials, all of whom were briefed about the attack by the U.S. intelligence community. The Chinese hackers, who the United States believes are linked to Beijing's Ministry of State Security, have burrowed inside the private wiretapping and surveillance system that American telecom companies built for the exclusive use of U.S. federal law enforcement agencies — and the U.S. government believes they likely continue to have access to the system.... The U.S. government and the telecom companies that are dealing with the breach have said very little publicly about it since it was first detected in August, leaving the public to rely on details trickling out through leaks... The so-called lawful-access system breached by the Salt Typhoon hackers was established by telecom carriers after the terrorist attacks of Sept. 11, 2001, to allow federal law enforcement officials to execute legal warrants for records of Americans' phone activity or to wiretap them in real time, depending on the warrant. Many of these cases are authorized under the Foreign Intelligence Surveillance Act (FISA), which is used to investigate foreign spying that involves contact with U.S. citizens. The system is also used for legal wiretaps related to domestic crimes. It is unknown whether hackers were able to access records about classified wiretapping operations, which could compromise federal criminal investigations and U.S. intelligence operations around the world, multiple officials told me. But they confirmed the previous reporting that hackers were able to both listen in on phone calls and monitor text messages. "Right now, China has the ability to listen to any phone call in the United States, whether you are the president or a regular Joe, it makes no difference," one of the hack victims briefed by the FBI told me. "This has compromised the entire telecommunications infrastructure of this country." The Wall Street Journal first reported on Oct. 5 that China-based hackers had penetrated the networks of U.S. telecom providers and might have penetrated the system that telecom companies operate to allow lawful access to wiretapping capabilities by federal agencies... [After releasing a short statement], the FBI notified 40 victims of Salt Typhoon, according to multiple officials. The FBI informed one person who had been compromised that the initial group of identified targets included six affiliated with the Trump campaign, this person said, and that the hackers had been monitoring them as recently as last week... "They had live audio from the president, from JD, from Jared," the person told me. "There were no device compromises, these were all real-time interceptions...." [T]he duration of the surveillance is believed to date back to last year. Several officials told the columnist that the cyberattack also targeted senior U.S. government officials and top business leaders — and that even more compromised targets are being discovered. At this point, "Multiple officials briefed by the investigators told me the U.S. government does not know how many people were targeted, how many were actively surveilled, how long the Chinese hackers have been in the system, or how to get them out." But the article does include this quote from U.S. Senate Intelligence Committee chairman Mark Warner.
"It is much more serious and much worse than even what you all presume at this point." One U.S. representative suggested Americans rely more on encrypted apps. The U.S. is already investigating — but while researching the article, the columnist writes, "The National Security Council declined to comment, and the FBI did not respond to a request for comment..." They end with this recommendation. "If millions of Americans are vulnerable to Chinese surveillance, they have a right to know now."

Read more of this story at Slashdot.

Categories: Computer, News

New 'Open Source AI Definition' Criticized for Not Opening Training Data

Slashdot - Sun, 2024-11-03 18:34
Long-time Slashdot reader samj — also a long-time Debian developer — tells us there's some opposition to the newly-released Open Source AI definition. He calls it a "fork" that undermines the original Open Source definition (which was originally derived from Debian's Free Software Guidelines, written primarily by Bruce Perens), and points us to a new domain with a petition declaring that instead Open Source shall be defined "solely by the Open Source Definition version 1.9. Any amendments or new definitions shall only be recognized with clear community consensus via an open and transparent process." This move follows some discussion on the Debian mailing list: Allowing "Open Source AI" to hide their training data is nothing but setting up a "data barrier" protecting the monopoly, disabling anybody other than the first party to reproduce or replicate an AI. Once passed, OSI is making a historical mistake towards the FOSS ecosystem. They're not the only ones worried about data. This week TechCrunch noted an August study which "found that many 'open source' models are basically open source in name only. The data required to train the models is kept secret, the compute power needed to run them is beyond the reach of many developers, and the techniques to fine-tune them are intimidatingly complex. Instead of democratizing AI, these 'open source' projects tend to entrench and expand centralized power, the study's authors concluded." samj shares the concern about training data, arguing that training data is the source code and that this new definition has real-world consequences. (On a personal note, he says it "poses an existential threat to our pAI-OS project at the non-profit Kwaai Open Source Lab I volunteer at, so we've been very active in pushing back past few weeks.") And he also came up with a detailed response by asking ChatGPT. What would be the implications of a Debian disavowing the OSI's Open Source AI definition? ChatGPT composed a 7-point, 14-paragraph response, concluding that this level of opposition would "create challenges for AI developers regarding licensing. It might also lead to a fragmentation of the open-source community into factions with differing views on how AI should be governed under open-source rules." But "Ultimately, it could spur the creation of alternative definitions or movements aimed at maintaining stricter adherence to the traditional tenets of software freedom in the AI age." However the official FAQ for the new Open Source AI definition argues that training data "does not equate to a software source code." Training data is important to study modern machine learning systems. But it is not what AI researchers and practitioners necessarily use as part of the preferred form for making modifications to a trained model.... [F]orks could include removing non-public or non-open data from the training dataset, in order to train a new Open Source AI system on fully public or open data... [W]e want Open Source AI to exist also in fields where data cannot be legally shared, for example medical AI. Laws that permit training on data often limit the resharing of that same data to protect copyright or other interests. Privacy rules also give a person the rightful ability to control their most sensitive information — like decisions about their health. Similarly, much of the world's Indigenous knowledge is protected through mechanisms that are not compatible with later-developed frameworks for rights exclusivity and sharing. Read on for the rest of their response...

Read more of this story at Slashdot.

Categories: Computer, News

Invisible, Super Stretchy Nanofibers Discovered In Natural Spider Silk

Slashdot - Sun, 2024-11-03 17:34
Long-time Slashdot reader yet-another-lobbyist writes: Phys.org has an article on the recent discovery of super stretchy nanofibers in natural spider silk! The thinnest natural spider silk nanofibrils ever seen are only a few molecular layers thin, about 5 nm. They are too thin to be seen even with a very powerful optical microscope. Researchers used atomic force microscopy (AFM) not only to visualize them, but also to probe their stretchiness and strength. Even the original article is available without a paywall. Mechanical tests of molecularly thin materials — pretty cool! The doctoral candidate's advisor thought it would be impossible to perform the measurements, according to the article, which quotes him as saying "It's actually kind of crazy to think that it's even possible.... We humans think we're so great and we can invent things, but if you just take a step outside, you find so many things that are more exciting." That advisor — long-time spider-silk researcher Hannes Schniepp (also a co-author on the paper) — adds that the tip of the needle was so sharp, its end was only a few atoms thick. "You would not see the end of it in the best optical microscope. It will just disappear because it's so small that you can't even see it. It's probably one of the highest developed technologies on the planet." If humans find a way to replicate the structure of spider silk, it could be manufactured for use in practical applications. "You could make a super bungee cord from it," said Schniepp. "Or a shield around a structure where you have something incoming at high velocity and you need to absorb a lot of energy. Things like that."

Read more of this story at Slashdot.

Categories: Computer, News

Can Heat Pumps Still Save the Planet from Climate Change?

Slashdot - Sun, 2024-11-03 16:34
"One technology critical to fighting climate change is lagging," reports the Washington Post, "thanks to a combination of high interest rates, rising costs, misinformation and the cycle of home construction. Adoption of heat pumps, one of the primary ways to cut emissions from buildings, has slowed in the United States and stalled in Europe, endangering the switch to clean energy. "Heat pump investment in the United States has dropped by 4 percent in the past two years, even as sales of EVs have almost doubled, according to data from MIT and the Rhodium Group. In 13 European countries, heat pump sales dropped nearly in half in the first half of 2024, putting the European Union off-track for its climate goals." "Many many markets are falling," said Paul Kenny, the director general of the European Heat Pump Association. "It takes time to change people's minds about a heating system." Heat pumps — essentially air conditioners that can also work in reverse, heating a space as well as cooling it — are crucial to making buildings more climate-friendly. Around 60 percent of American homes are still heated with furnaces running on oil, natural gas, or even propane; to cut emissions from homes, all American houses and apartments will need to be powered by electricity... In the United States, experts point to lags in construction, high interest rates, and general belt-tightening from inflation... [Cora Wyent, director of research for the electrification advocacy group Rewiring America] added, heat pumps are still growing as a share of overall heating systems, gaining ground on gas furnaces. In 2023, heat pumps made up 55 percent of all heating systems sold, while gas furnaces made up just 45 percent. "Heat pumps are continuing to increase their total market share," she said. Homeowners may also run into trouble when trying to find contractors to install heat pumps. Barton James, the president and CEO of the Air Conditioning Contractors of America, says many contractors don't have training on how to properly install heat pumps; if they install them incorrectly, the ensuing problems can sour consumers on the technology... In the United States, low gas prices also make the economics of heat pumps more challenging. Gas is around three times cheaper than electricity — while heat pumps make up most of that ground with efficiency, they aren't the most cost-effective option for every household. The Post also spoke to the manager for the carbon-free buildings team at the clean energy think tank RMI. They pointed out that heating systems need to be replaced roughly every 15 years — and the next cycle doesn't start until 2035. The article concludes that "even with government policies and subsidies, many parts of the move to clean energy will require individual people to make changes to their lives. According to the International Energy Agency, the number of heat pumps will have to triple by 2030 to stay on track with climate goals. The only way to do that, experts say, is if incentives, personal beliefs, and technology all align."

Read more of this story at Slashdot.

Categories: Computer, News

AI Bug Bounty Program Finds 34 Flaws in Open-Source Tools

Slashdot - Sun, 2024-11-03 13:34
Slashdot reader spatwei shared this report from SC World: Nearly three dozen flaws in open-source AI and machine learning (ML) tools were disclosed Tuesday as part of [AI-security platform] Protect AI's huntr bug bounty program. The discoveries include three critical vulnerabilities: two in the Lunary AI developer toolkit [both with a CVSS score of 9.1] and one in a graphical user interface for ChatGPT called Chuanhu Chat. The October vulnerability report also includes 18 high-severity flaws ranging from denial-of-service to remote code execution... Protect AI's report also highlights vulnerabilities in LocalAI, a platform for running AI models locally on consumer-grade hardware, LoLLMs, a web UI for various AI systems, LangChain.js, a framework for developing language model applications, and more. In the article, Protect AI's security researchers point out that these open-source tools are "downloaded thousands of times a month to build enterprise AI Systems." The three critical vulnerabilities have already been addressed by their respective companies, according to the article.

Read more of this story at Slashdot.

Categories: Computer, News

What's Worse Than Setting Clocks Back an Hour? Permanent Daylight Savings Time

Slashdot - Sun, 2024-11-03 08:34
"It's that time again," writes USA Today, noting that Sunday morning millions of Americans (along with millions more in Canada, Europe, parts of Australia, and Chile) "will set their clocks back an hour, and many will renew their twice-yearly calls to put an end to the practice altogether..." Experts say the time changes are detrimental to health and safety, but agree that the answer isn't permanent DST. "The medical and scientific communities are unified ... that permanent standard time is better for human health," said Erik Herzog, a professor of biology and neuroscience at Washington University in St. Louis and the former president of the Society for Research on Biological Rhythms... Springing forward an hour in March is harder on us than falling back in November. The shift in spring is associated with an increase in heart attacks, and car accident rates also go up for a few days after, he said. But the answer isn't permanent daylight saving time, according to Herzog, who said that could be even worse for human health than the twice-yearly changes. By looking at studies of people who live at the easternmost edge of time zones (whose experience is closest to standard time) and people who live at the westernmost edge (more like daylight saving time), scientists can tell that health impacts of earlier sunrises and sunsets are much better. Waking up naturally with the sun is far better for our bodies than having to rely on alarm clocks to wake up in the dark, he said. Herzog said Florida, where [Senator Marco] Rubio has championed the Sunlight Protection Act, is much less impacted by the negative impacts of daylight saving time because it's as far east and south as you can get in the U.S., while people in a state like Minnesota would have much more time in the dark in the morning. The article also reminds U.S. readers that "No state can adopt permanent daylight saving time unless U.S. Congress passes a law to authorize it first." Nevertheless... Oklahoma became the most recent state to pass a measure authorizing permanent daylight saving time, pending Congressional approval, in April. Nineteen other states have passed laws or resolutions to move toward daylight saving time year-round, if Congress were ever to allow it, according to the National Conference of State Legislatures... Only two states and some territories never have to set their clocks forward or backward... [Hawaii and Arizona, except for the Navajo Nation.]

Read more of this story at Slashdot.

Categories: Computer, News

ASWF: the Open Source Foundation Run By the Folks Who Give Out Oscars

Slashdot - Sun, 2024-11-03 05:34
This week's Ubuntu Summit 2024 was attended by Lproven (Slashdot reader #6,030). He's also a FOSS correspondent for the Register, where he's filed this report: One of the first full-length sessions was presented by David Morin, executive director of the Academy Software Foundation, introducing his organization in a talk about Open Source Software for Motion Pictures. Morin linked to the Visual Effects Society's VFX/Animation Studio Workstation Linux Report, highlighting the market share pie-chart, showing Rocky Linux 9 at some 58 percent and the RHELatives in general at 90 percent of the market. Ubuntu 22 and 24 — the report's nomenclature, not this vulture's — got just 10.5 percent. We certainly didn't expect to see that at an Ubuntu event, with the latest two versions of Rocky Linux taking 80 percent of the studio workstation market... What also struck us over the next three quarters of an hour is that Linux and open source in general seem to be huge components of the movie special effects industry — to an extent that we had not previously realized. There's a "sizzle reel" showing examples of how major motion pictures used OpenColorIO, an open-source production tool for syncing color representations originally developed by Sony Pictures Imageworks. That tool is hosted by a collaboration between the Linux Foundation and the Science and Technology Council of the Academy of Motion Picture Arts and Sciences (the "Academy" of the Academy Awards). The collaboration — which goes by the name of the Academy Software Foundation — hosts 14 different projects. The ASWF hasn't been around all that long — it was only founded in 2018. Despite the impact of the COVID pandemic, by 2022 it had achieved enough to fill a 45-page history called Open Source in Entertainment [PDF]. Morin told the crowd that it runs events, provides project marketing and infrastructure, as well as funding, training and education, and legal assistance. It tries to facilitate industry standards and does open source evangelism in the industry. An impressive list of members — with 17 Premier companies, 16 General ones, and another half a dozen Associate members — shows where some of the money comes from. It's a big list of big names. [Adobe, AMD, AWS, Autodesk...] The presentation started with OpenVDB, a C++ library developed and donated by DreamWorks for working with three-dimensional voxel-based shapes. (In 2020 they created this sizzle reel, but this year they've unveiled a theme song.) Also featured was OpenEXR, originally developed at Industrial Light and Magic and open-sourced in 1999. (The article calls it "a specification and reference implementation of the EXR file format — a losslessly compressed image storage format for moving images at the highest possible dynamic range.") "For an organization that is not one of the better-known ones in the FOSS space, we came away with the impression that the ASWF is busy," the article concludes. (Besides running Open Source Days and ASWF Dev Days, it also hosts several working groups: the Language Interop Project works on Rust bindings, and the Continuous Integration Working Group on CI tools.) There's generally very little of the old razzle-dazzle in the Linux world, but with the demise of SGI as the primary maker of graphics workstations — its brand now absorbed by Hewlett Packard Enterprise — the visual effects industry moved to Linux and it's doing amazing things with it. And Kubernetes wasn't even mentioned once.

Read more of this story at Slashdot.

Categories: Computer, News

The 'Passive Housing' Trend is Booming

Slashdot - Sun, 2024-11-03 02:34
The Washington Post reports that a former Etsy CEO remodeled their home into what's known as a passive house. It's "designed to be as energy efficient as possible, typically with top-notch insulation and a perfect seal that prevents outside air from penetrating the home; air flows in and out through filtration and exhaust systems only." Their benefits include protection from pollution and pollen, noise insulation and a stable indoor temperature that minimizes energy needs. That translates to long-term savings on heating and cooling. While the concept has been around for about 50 years, experts say that the United States is on the cusp of a passive house boom, driven by lowered costs, state-level energy code changes and a general greater awareness of — and desire for — more sustainable housing... Massachusetts — which alongside New York and Pennsylvania is one of the leading states in passive house adoption — has 272 passive house projects underway thanks to an incentive program, says Zack Semke [the director of the Passive House Accelerator, a group of industry professionals who aim to spread lessons in passive house building]. Consumer demand for passive houses is also increasing, says Michael Ingui, an architect in New York City and the founder of the Passive House Accelerator... The need to lower our energy footprint is so much more top-of-mind today than it was 10 years ago, Ingui says, and covid taught us about the importance of good ventilation and filtered fresh air. "People are searching for the healthiest house," he says, "and that's a passive house...." These days, new passive houses are usually large, multifamily apartment buildings or high-end single-family homes. But that leaves out a large swath of homeowners in the middle. To widen passive house accessibility to include all types of people and their housing needs, we need better energy codes and even more policies and incentives, says In Cho, a sustainability architect, educator and a co-founder of the nonprofit Passive House for Everyone! Passive houses "can and should serve folks from all socioeconomic backgrounds," she says. Using a one-two punch of mandates for energy efficient buildings and greater awareness to the public, that increased demand for passive houses will lead to more supply, Cho says. And we're already seeing those changes in the market. Take triple-pane windows, for example, which are higher performing and more insulating than their double-pane counterparts. Even just 10 to 20 years ago, the difference in price between the two was high enough to make triple-pane windows cost-prohibitive for a lot of people, Cho says. Over the years, as the benefits of higher performing windows became more well-known, and as cities and states changed their energy codes, more companies began producing better windows. Now they're basically at price parity, she says. If we keep pushing for greater awareness and further policy changes, it's possible that all of the components of passive house buildings could follow that trend. "For large multifamily projects, we're already seeing price parity in some cases," Semke says... "But as it stands, single-family passive houses are still likely to cost a margin more than non-passive houses," he says. "This is because price parity is easier to achieve when working at larger scales, but also because many of the housing policies and incentives encouraging passive house buildings are geared toward these larger projects."

Read more of this story at Slashdot.

Categories: Computer, News

Don't Look Now, but GM's EV Sales Are on Fire

Slashdot - Sat, 2024-11-02 23:53
GM's president of global markets says their EV portfolio "is growing faster than the market," according to Investopedia, "because we have an all-electric vehicle for just about everybody, no matter what they like to drive." The headline at Barrons? "Don't Look Now, but GM's EV Sales Are on Fire." GM delivered almost 32,000 all-electric vehicles in the third quarter — a record — up about 58% from a year earlier. The more affordable Chevy Equinox, which starts at about $35,000 before any federal tax credit, helped boost sales. GM delivered almost 10,000 of the new EVs, up from 1,013 in the second quarter, when they first went on sale. EV penetration of total GM car sales was almost 5%, up almost two percentage points year over year. EVs accounted for 19.4% of Cadillac sales, up about 11 percentage points year over year. Year to date, GM has delivered just over 70,000 all-electric cars. GM originally planned to manufacture 200,000 EVs in 2024. That still looks aggressive, but the strong third-quarter showing makes 120,000 possible, which would be up almost 60% year over year — a respectable outcome. More important to investors than EV sales right now might be dealer inventories. GM said there were about 627,000 vehicles on dealer lots at the end of September. That's a little better than what Wolfe Research analyst Emmanuel Rosner expected. It indicates GM dealers have roughly 60 days' worth of sales on their lots. That's a safe level. Lower dealer inventories reduce pressure to cut prices. They also reduce the need to cut production because dealer lots are full... GM expects to generate a full-year operating profit of about $14 billion. Meanwhile, Stellantis "slashed its financial guidance recently, partly because it needs to dramatically reduce its U.S. inventories," according to the article. For example, its Jeep dealers ended August with roughly 122 days' worth of sales on their lots, while its Dodge dealers "had almost 150 days of inventory." And Investopedia argues that while GM's EV sales growth is "soaring," Ford's is showing "only modest gains." [W]hile Ford's overall U.S. sales were 0.7% higher at 504,039, it had just a 12% gain in EVs to 23,509. In the second quarter, Ford's EV sales had soared 61% to 23,957. Sales growth was more than three times higher for Ford's hybrid models, with President of Ford Blue and Ford Customer Service Division Andrew Frick arguing that the company has "listened to customers to offer them vehicles with powertrains to meet their specific needs." Ford is hoping to boost EV sales by offering buyers a free home charger and installation.

Read more of this story at Slashdot.

Categories: Computer, News

Is AI-Driven 0-Day Detection Here?

Slashdot - Sat, 2024-11-02 22:52
"AI-driven 0-day detection is here," argues a new blog post from ZeroPath, makers of a GitHub app that "detects, verifies, and issues pull requests for security vulnerabilities in your code." They write that AI-assisted security research "has been quietly advancing" since early 2023, when researchers at the DARPA and ARPA-H's Artificial Intelligence Cyber Challenge demonstrated the first practical applications of LLM-powered vulnerability detection — with new advances continuing. "Since July 2024, ZeroPath's tool has uncovered critical zero-day vulnerabilities — including remote code execution, authentication bypasses, and insecure direct object references — in popular AI platforms and open-source projects." And they ultimately identified security flaws in projects owned by Netflix, Salesforce, and Hulu by "taking a novel approach combining deep program analysis with adversarial AI agents for validation. Our methodology has uncovered numerous critical vulnerabilities in production systems, including several that traditional Static Application Security Testing tools were ill-equipped to find..." TL;DR — most of these bugs are simple and could have been found with a code review from a security researcher or, in some cases, scanners. The historical issue, however, with automating the discovery of these bugs is that traditional SAST tools rely on pattern matching and predefined rules, and miss complex vulnerabilities that do not fit known patterns (i.e. business logic problems, broken authentication flaws, or non-traditional sinks such as from dependencies). They also generate a high rate of false positives. The beauty of LLMs is that they can reduce ambiguity in most of the situations that caused scanners to be either unusable or produce few findings when mass-scanning open source repositories... To do this well, you need to combine deep program analysis with an adversarial agents that test the plausibility of vulnerabilties at each step. The solution ends up mirroring the traditional phases of a pentest — recon, analysis, exploitation (and remediation which is not mentioned in this post)... AI-driven vulnerability detection is moving fast... What's intriguing is that many of these vulnerabilities are pretty straightforward — they could've been spotted with a solid code review or standard scanning tools. But conventional methods often miss them because they don't fit neatly into known patterns. That's where AI comes in, helping us catch issues that might slip through the cracks. "Many vulnerabilities remain undisclosed due to ongoing remediation efforts or pending responsible disclosure processes," according to the blog post, which includes a pie chart showing the biggest categories of vulnerabilities found: 53%: Authorization flaws, including roken access control in API endpoints and unauthorized Redis access and configuration exposure. ("Impact: Unauthorized access, data leakage, and resource manipulation across tenant boundaries.") 26%: File operation issues, including directory traversal in configuration loading and unsafe file handling in upload features. ("Impact: Unauthorized file access, sensitive data exposure, and potential system compromise.") 16%: Code execution vulnerabilities, including command injection in file processing and unsanitized input in system commands. 
("Impact: Remote code execution, system command execution, and potential full system compromise.") The company's CIO/cofounder was "former Red Team at Tesla," according to the startup's profile at YCombinator, and earned over $100,000 as a bug-bounty hunter. (And another co-founded is a former Google security engineer.) Thanks to Slashdot reader Mirnotoriety for sharing the article.

Read more of this story at Slashdot.

Categories: Computer, News
