Category Archives: Computer Science

Computer science major applied to 456 internships, got three offers – Business Insider

"I started applying in July, and soon I hit 200 applications," Oliver Wu, a computer science major at the University of Michigan, told Newsweek. (Photo: alvarez via Getty Images)

Computer science major Oliver Wu says he pulled out all the stops in his quest for a summer internship.

"456 applications, 56 interviews, and 0 sleep in 4 months, all for 1 internship," the University of Michigan junior wrote in a TikTok post published on January 11.


Wu, whose post has been viewed more than 2.9 million times, told Newsweek that he started applying even before classes resumed in the fall.

"I started applying in July, and soon I hit 200 applications," Wu said, adding that he was making 15 to 20 applications a day before the school term began.

Wu told Newsweek that he did feel "burned out" from the long-drawn search, which saw him receive "hundreds of rejections."

"The hardest part was staying positive and working hard," Wu said.

But Wu remained unfazed and pressed on with his search.

"I did not want to feel regret that I could have tried harder, so I made up my mind to pursue this with everything I had," Wu told the outlet.

Wu's efforts eventually paid off. He secured three offers and will join Ford as an enterprise technology intern this summer.

"I was in class at the time, and I remember stepping out, going into the hallway, and jumping up and down while silently screaming in excitement for around 10 minutes," Wu recounted to Newsweek.

Wu isn't the only one who has had to struggle with a slowing job market for tech.

The industry-wide layoffs, which began in late 2022, don't seem to have abated. Tech companies have continued to ax staff to streamline their operations.

Last week, Google CEO Sundar Pichai told staff to brace for more job cuts this year. Pichai said the layoffs were about "removing layers to simplify execution and drive velocity in some areas."

Wu did not immediately respond to a request for comment from Business Insider sent outside regular business hours.

Read this article:

Computer science major applied to 456 internships, got three offers - Business Insider

OSU-Cascades professor receives $628,000 ODE award to boost K-12 students’ participation in computer science – KTVZ

Only 4% of the state's high schoolers are taking the classes, and just 2% of those students are female

BEND, Ore. (KTVZ) -- An Oregon State University-Cascades professor has received a $628,000 award from the Oregon Department of Education to expand universally engaging, culturally appropriate and inclusive computer science education for K-12 students.

The award is part of a statewide plan that aims to ensure computer science education is available to public school students on an equitable basis and to broaden participation for all students by 2027-28. It has been funded in part through a $9.8 million federal grant awarded to the Oregon Department of Education. In addition to OSU-Cascades, the plan was also developed in partnership with the University of Oregon and Portland State University.

Roughly 41% of high schools in Oregon offer students a foundational computer science course, according to the Oregon Department of Education, but just 4% of the state's high schoolers are taking one of those classes, and only 2% of those students are female.

"An understanding of computing is necessary, foundational knowledge in a world where technology is part of our everyday lives, no matter our career choices," said the award administrator, Jill Hubbard, an associate professor of practice in computer science who also coordinates the degree program at OSU-Cascades.

"This award will enhance Oregon K-12 teachers' ability to familiarize every student, including underserved students, in their classrooms with computing," she said. "As a byproduct, society will benefit from more diversity in the technology workforce and in approaches to problem-solving using computing."

The award will build support structures that create systematic changes focused on equity and inclusion in computer science education.

According to Hubbard, many computer science teachers are tasked with creating curriculum with little professional development support. Often, courses are tied to specific educators, making it difficult to sustain courses from year to year, especially in rural areas.

The award will fund professional development in Exploring Computer Science, a national, equity-driven and research-based introductory high school curriculum shown to increase participation by underserved students. Hubbard said the award will also provide stipends for teachers to attend summer workshops and professional development throughout their first year of teaching the curriculum.

Additionally, the award will establish regional specialists to support local professional development communities of computer science teachers. It will also fund the development of resources and workshops to support K-12 school counselors and administrators, who help teachers ensure all students have access to computer science education, said Hubbard.

"We're working to develop scalable, sustainable, and systemic solutions that intentionally increase participation in computer science by underserved students," Hubbard said.

Continue reading here:

OSU-Cascades professor receives $628,000 ODE award to boost K-12 students' participation in computer science - KTVZ

AI learns to simulate how trees grow and shape in response to their environments – Tech Xplore

This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility: fact-checked, peer-reviewed publication, trusted source, proofread.

A research team from Purdue University's Department of Computer Science and Institute for Digital Forestry, with collaborator Sören Pirk at Kiel University in Germany, has discovered that artificial intelligence can simulate tree growth and shape.

The DNA molecule encodes both tree shape and environmental response in one tiny, subcellular package. In work inspired by DNA, Bedrich Benes, professor of computer science, and his associates developed novel AI models that compress the information required for encoding tree form into a megabyte-sized neural model.

After training, the AI models encode the local development of trees and can be used to generate complex tree models with several gigabytes of detailed geometry as output.

In two papers, one published in ACM Transactions on Graphics and the other in IEEE Transactions on Visualization and Computer Graphics, Benes and his co-authors describe how they created their tree-simulation AI models.

"The AI models learn from large data sets to mimic the intrinsic discovered behavior," Benes said.

Non-AI-based digital tree models are quite complicated, involving simulation algorithms that consider many mutually affecting nonlinear factors. Such models are needed in endeavors such as architecture and urban planning, as well as in the gaming and entertainment industries, to make designs more realistically appealing to their potential clients and audiences.

After working with AI models for nearly 10 years, Benes expected them to be able to significantly improve the existing methods for digital tree twins. The size of the models was surprising, however. "It's complex behavior, but it has been compressed to rather a small amount of data," he said.

Co-authors of the ACM Transactions on Graphics paper were Jae Joong Lee and Bosheng Li, Purdue graduate students in computer science. Co-authors of the IEEE Transactions on Visualization and Computer Graphics paper were Li and Xiaochen Zhou, also a Purdue graduate student in computer science; Songlin Fei, the Dean's Chair in Remote Sensing and director of the Institute for Digital Forestry; and Sören Pirk of Kiel University, Germany.

The researchers used deep learning, a branch of machine learning within AI, to generate growth models for maple, oak, pine, walnut and other tree species, both with and without leaves. Deep learning involves developing software that trains AI models to perform specified tasks through linked neural networks that attempt to mimic certain functionalities of the human brain.

"Although AI has become seemingly pervasive, thus far it has mostly proved highly successful in modeling 3D geometries unrelated to nature," Benes said. These include endeavors related to computer-aided design and improving algorithms for digital manufacturing.

"Getting a 3D geometry vegetation model has been an open problem in computer graphics for decades," stated Benes and his co-authors in their ACM Transactions paper. Although some approaches to simulating biological behaviors are improving, they noted, "simple methods that would quickly provide many 3D models of real trees are not readily available."
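The procedural baseline this line of work builds on is the L-system, the string-rewriting grammar named in the title of the ACM paper ("Latent L-systems"). As a rough illustration of the idea, the sketch below uses a textbook "fractal plant" rule set, not the paper's learned model:

```python
# Illustrative sketch of a classic bracketed L-system, the procedural
# grammar that tree-modeling work such as "Latent L-systems" builds on.
# The rules here are a well-known textbook example, not the paper's model.

def expand(axiom: str, rules: dict, iterations: int) -> str:
    """Apply the rewriting rules to every symbol in parallel, per step."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# A common 2D "fractal plant" rule set: F = draw forward, +/- = turn,
# and [ ] = push/pop the drawing state (i.e., start and end a branch).
rules = {"X": "F+[[X]-X]-F[-FX]+X", "F": "FF"}

plant = expand("X", rules, 3)
print(len(plant))  # the string (and thus the tree) grows rapidly per step
```

A handful of rules like these can generate arbitrarily detailed branching geometry, which is why compressing tree form into a small model, as the Purdue team does with neural networks, is plausible in the first place.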

Experts with biological expertise have traditionally developed tree-growth simulations. They understand how trees interact with environmental conditions. Understanding these complicated interactions depends upon characteristics bestowed upon the tree by its DNA. These include branching angles, which are much larger for pines than for oaks, for example. The environment, meanwhile, dictates other characteristics that can result in the same type of tree grown under two different conditions displaying completely different shapes.

"Decoupling the tree's intrinsic properties and its environmental response is extremely complicated," Benes said. "We looked at thousands of trees, and we thought, 'Hey, let AI learn it.' And maybe we can then learn the essence of tree form with AI."

Scientists typically build models based on hypotheses and observations of nature. As models created by humans, they have reasoning behind them. The researchers' models generalize behavior from several thousand trees' worth of input data that became encoded within the AI. Then the researchers validate that the models behave the way the input data behave.

The AI tree models' weakness is that they lack training data that describes real-world 3D tree geometry.

"In our methods, we needed to generate the data. So our AI models are not simulating nature. They are simulating tree developmental algorithms," Benes said. He aspires to reconstruct 3D geometry data from real trees inside a computer.

"You take your cellphone, take a picture of a tree, and you get a 3D geometry inside the computer. It could be rotated. Zoom in. Zoom out," he said. "This is next. And it's perfectly aligned with the mission of digital forestry."

More information: Jae Joong Lee et al, Latent L-systems: Transformer-based Tree Generator, ACM Transactions on Graphics (2023). DOI: 10.1145/3627101

Xiaochen Zhou et al, DeepTree: Modeling Trees with Situated Latents, IEEE Transactions on Visualization and Computer Graphics (2023). DOI: 10.1109/TVCG.2023.3307887

Journal information: ACM Transactions on Graphics

Read more:

AI learns to simulate how trees grow and shape in response to their environments - Tech Xplore

A Michigan computer science major says he sent out 456 internship applications and only got 3 offers – Business Insider India

Go here to read the rest:

A Michigan computer science major says he sent out 456 internship applications and only got 3 offers - Business Insider India

Chattopadhyay receives the NAS Michael and Sheila Held Prize | Cornell Chronicle – Cornell Chronicle

The National Academy of Sciences (NAS) has selected Eshan Chattopadhyay, assistant professor of computer science in the Cornell Ann S. Bowers College of Computing and Information Science, as the 2024 co-recipient of the Michael and Sheila Held Prize, along with collaborator David Zuckerman, professor of computer science at the University of Texas at Austin, for their novel work on randomized algorithms.

Given annually since 2018, the Held Prize honors "outstanding, innovative, creative, and influential research in the areas of combinatorial and discrete optimization, or related parts of computer science." The pair will receive the award, along with a $100,000 prize, at the NAS 161st Annual Meeting on April 28.

"I am deeply honored to receive this award and to have my work with David on randomness extractors recognized by the larger scientific community," Chattopadhyay said.

Read the entire story on the Cornell Bowers CIS website.

See the rest here:

Chattopadhyay receives the NAS Michael and Sheila Held Prize | Cornell Chronicle - Cornell Chronicle

Faulkner University News Faulkner University to Add Computer Engineering Degree in Fall 2024 – Faulkner University

Mike Herridge, right, speaks with WSFA news crew about Faulkner's new degree.

Computer engineering is coming to Faulkner in fall 2024 as the university officially adds to its offerings one of the most in-demand degrees, according to collegeconsensus.com.

Faulkner's brand-new Bachelor of Science in Computer Engineering will be added to the university's ever-popular computer science department, which for the last six years has boasted 100 percent job placement for its graduates.

Computer engineering will fuse computer science and electrical engineering to equip students with the skills employers are looking for, ensuring they're ready to bridge the gap between programming and the real world.

Some common employers for computer engineers are Microsoft, Cisco, Amazon, Tesla, SpaceX, General Motors, and Cummins.

With a grounding in Christian perspectives and morals, students gain preparation for careers in:

Space exploration design and control

Robotic design and application

Automotive Controls

Human-to-technology interfacing, like Alexa or Hey Google

Industrial Process Control

"After graduating from our program, students with a computer engineering degree will be able to marry software with hardware. They'll be prepared to write desktop applications, develop software, create hardware, as well as control and develop the programs and components that run planes, cars and rockets," said Mike Herridge, chair of the computer science department. "Overall, our students from the department of computer science at Faulkner are ready to create and maintain any consumer-facing or back-end applications for industry and business. Today, every industry requires a computer expert at the heart of its business."

Technology is growing, changing and expanding rapidly. It needs problem solvers and innovators. For students who enjoy the challenge of solving problems, creating new ways of doing things, or staying on top of the latest technology, Faulkner's Computer Science Department is the place to be.

The Computer Science Department of Faulkner University is hands-on, with a strong emphasis on the fundamentals of programming, hardware and software. Students will learn from instructors who work in the field and can advise on what students need to know to be valuable in the industry.

Students who are interested in Faulkner's new degree can apply now at www.faulkner.edu/apply.

Read the original post:

Faulkner University News Faulkner University to Add Computer Engineering Degree in Fall 2024 - Faulkner University

Clark is breaking down barriers and staking her claim in cybersecurity field – UKNow

LEXINGTON, Ky. (Jan. 22, 2024) -- Tiffany Clark found herself entangled in a challenging predicament this fall. She grappled with conflicting options as a looming decision deadline cast its shadow. Many young adults her age would have coveted the intricacy of Clark's dilemma: Which well-paying, potentially career-making summer internship should I choose, the one nestled in the sunny theme park capital of the world or the other in glitzy proximity to the famed Hollywood sign?

A junior computer science major in the University of Kentucky's Stanley and Karen Pigman College of Engineering, Clark meticulously considered the prospect of interning for either Northrop Grumman in Redondo Beach, California, or Lockheed Martin in Orlando, Florida, come the summer of 2024.

Both alternatives starkly contrast Clark's upbringing in Caneyville, Kentucky. With its modest population of just over 500, the Grayson County community offers scarce professional prospects for individuals skilled in cybersecurity and software engineering.

Following the sudden death of her mother, Tiffany Clark was raised by her maternal grandmother and, later, her maternal aunt and grandfather, alongside her older brother. Throughout her formative years, their ventures seldom extended beyond close-knit Grayson County's boundaries. With an expanded view of the world, however, she now grapples with the notion of continuing to reside in such a sparsely populated community.

"My grandfather's lived there his entire life," Clark laughs. "He won't even drive outside the county. If he has to go to Hardin County to look for tractor parts, he calls up his friends to take him. He's happy living there and not leaving the county, and that's totally fine. I just could never do that."

Clark committed to attending UK without ever setting foot on its campus. Yet, this daring leap of faith sparked a transformative college experience, awakening in her a profound wanderlust for global exploration. A testament to this profound curiosity, she spent two weeks abroad in England and France last year as part of UK's Chellgren Center for Undergraduate Excellence. She aspires to leverage a career in cybersecurity to continue discovering the diverse facets of our increasingly interconnected world and maybe catch a bad guy or two along the way.

"It's mainly about having an impact," Clark explains of her career aspirations. "Politically, I'm very neutral, but I can see that in terms of national security, the government does a lot of great things for us that we may never even know about, and cybersecurity plays a large part in that; so, I'm looking into the FBI or the CIA, or a defense contracting company that provides products and does contracts with the government."

Because it can call upon skills in both computer and electrical engineering, Clark says that she enjoys the challenge of finding computer vulnerabilities and trying to find ways people can exploit or hack into systems. She says she did not own a computer until her senior year of high school. Now, however, she is breaking down barriers and staking her claim in the still-male-dominated field of computing.

"Being a female computer scientist can be great," Clark explains. "It can make you stand out. But it can also be a little intimidating, maybe a bit of impostor syndrome, because your class will be dominated by males. Even with my internship last summer, the team had six interns, and I was the only girl. And in my office, there was only one woman, so it adds a little bit of 'oh, I need to do great. I need to stand out,' but I also needed to interact with these guys all the time."

Globally, only about a quarter of people in cybersecurity roles are female. As vice president of UK's chapter of the Society of Women Engineers, Clark works to empower her peers. Through outreach, she hopes to inspire women of all ages to consider taking STEM courses and develop an inclusive community within the Pigman College of Engineering. She learned about the possible internships at the Women in Engineering conference in Los Angeles, the world's largest conference for women in engineering and technology.

A Presidential Scholar, Chellgren Fellow and a member of the UK Lewis Honors College, Clark says the support system she found at UK has been instrumental to her academic achievements. The mentorship she receives from Stanley and Karen Pigman has been particularly gratifying.

"I'm so thankful to be a Pigman Scholar because of the mentorship they provide," Clark said. "Stan and Karen have my phone number! They'll call us during midterms. They'll ask, 'Do you need help? What do you need? Have you gone to this resource, that resource?' They understand how difficult the coursework can be. Just having that kind of motivation, somebody telling you, 'Good job, you're going to do great things,' is really motivating."

With the Pigmans' encouragement, Clark has already completed two summer internships, including one in Boston for defense contractor Riverside, and is determined to take advantage of opportunities as they present themselves.

Clark's pivotal decision to attend UK, coupled with the expansive possibilities inherent in her chosen field and the encouragement and support of mentors, didn't just create opportunities; it shattered the confines of her small-town roots.

"When I get older, I really would like to travel somewhere in Asia, like South Korea," Clark enthuses. "I do enjoy traveling. I want to go to a lot of different places, but definitely Asia. It's a completely different culture. Going to England or even Paris, you can still see similarities to the U.S., and a lot of people there speak English. But Asia, the culture is totally different."

The closest Clark may get to Asia in the immediate future, however, could be a visit to the China or Japan Pavilions at Epcot next summer. After thoughtful discussions with those whose insights she values and meticulously weighing the pros and cons associated with each internship, she ultimately chose the opportunity with Lockheed Martin in balmy Orlando.

"While I did ask people for their opinion, and they gave it, everyone, including the Pigmans, told me that this was my decision and they were both good offers," Clark says. "No one would have been disappointed, no matter which offer I accepted. Everyone told me to go with my gut and do what I want because, ultimately, it's my life and my future. Both offers would have been great, and I'm just sad that I couldn't do both."

In the end, the choice boiled down to what is best suited for Clark's future.

"I was given an offer in Northrop Grumman's space unit," Clark explains of what ultimately settled the decision. "I believe that Lockheed Martin's cyber unit will align better with my interests."

At Lockheed Martin, Clark will join Rotary and Mission Systems, working in software engineering with the Training, Logistics and Sustainment team. Clark hopes that her decision will eventually lead to a leadership position where she can one day follow in the steps of her mentors, the Pigmans, and give back to her community and alma mater.

"They really encourage the circle of giving back once you've paid your dues," says Clark, who currently works part-time in the college's philanthropy office. "I'd like to do something like that, to give back to UK and the Pigman College of Engineering because they've had such an impact on me."

See the article here:

Clark is breaking down barriers and staking her claim in cybersecurity field - UKNow

AI not yet ready to replace human workers, says MIT report – The Boston Globe

In a paper that has been submitted to the journal Management Science, Thompson's team concluded that it's not enough for AI systems to be good at tasks now performed by people. The system must be good enough to justify the cost of installing it and redesigning the way the job is done. Thompson said that for now, "there are a lot of places where ... humans are a more cost-efficient way to do that."

The researchers focused on AI applications that use machine vision, the ability of computers to understand visual data such as pictures and video. For instance, many smartphones have a machine vision feature that can identify objects that have been photographed by the phone's camera.

The researchers chose machine vision because it's a well-established technology that's already used for many tasks, like reading labels or inspecting manufactured goods for defects.

Over a three-year period beginning in 2020, the researchers identified 420 commercial and industrial tasks where an AI system with machine vision might be capable of supplanting human workers. They conducted online surveys of people with expert knowledge of how each task was performed, to figure out how much a business might lower its costs by substituting machine vision for human labor.

The researchers found that in most cases, the visual aspect of a job was just a small part of a worker's responsibilities. For instance, bakers must visually inspect the loaves of bread they're producing. But bakers spend only about 6 percent of their time doing this. Shifting this job to AI might enable the bakery to get by with fewer workers. But the cost of today's machine vision systems is much too high to justify cutting labor costs by just 6 percent. Only if AI becomes much cheaper would the tradeoff make sense.

Applying the same measurements to all 420 tasks, Thompson and his colleagues found that installing machine vision in place of human labor would lead to lower overall costs only 23 percent of the time. In the great majority of cases, human workers were cheaper.

Of course, these results apply only to the current state of the art. Thompson noted that AI systems keep getting better and cheaper. Over time, they'll be cost-effective at a much larger number of jobs.

In addition, the study only looks at machine vision applications. It doesn't include research into the impacts of the latest generative AI systems, such as ChatGPT. Thompson said his group is now working on ways to assess the potential impact of these systems. But he noted that this might take quite a while, because figuring out the potential cost savings for thousands of specific tasks requires a great deal of research.

While the new research paper has yet to go through peer review, other academics think Thompson's team is on the right track.

Christoph Riedl, associate professor of supply chain and information management systems at Northeastern University, called the study "absolutely plausible." Riedl noted that it fits the pattern of other major innovations that only achieved their full potential many years after they were first invented. "When we started using electricity to run factories instead of steam engines, switching to electricity didn't make factories more productive," said Riedl. "We had to learn how to use this new technology to gain any benefits from it."

Joseph Fuller, a professor at Harvard Business School, where he studies the future of work, said that the latest AI innovations will have dramatic impacts on some jobs. For instance, he said generative AI is already so good at creating computer software that it will reduce demand for human programmers. It's relatively easy to teach an AI how to write software: just feed it lots of examples created by humans.

But teaching an AI how to manufacture a complex object is far more difficult. "I can't unleash it on some database on the accuracy of manufacturing processes," said Fuller. "There is no such data, so I have nothing to train it on."

Hiawatha Bray can be reached at hiawatha.bray@globe.com. Follow him @GlobeTechLab.

Link:

AI not yet ready to replace human workers, says MIT report - The Boston Globe

New MIT CSAIL study suggests that AI wont steal as many jobs as expected – TechCrunch

Image Credits: Kirillm / Getty Images

Will AI automate human jobs, and if so which jobs and when?

That's the trio of questions a new research study from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), out this morning, tries to answer.

There have been many attempts to extrapolate and project how the AI technologies of today, like large language models, might impact people's livelihoods and whole economies in the future.

Goldman Sachs estimates that AI could automate 25% of the entire labor market in the next few years. According to McKinsey, nearly half of all work will be AI-driven by 2055. A survey from the University of Pennsylvania, NYU and Princeton finds that ChatGPT alone could impact around 80% of jobs. And a report from the outplacement firm Challenger, Gray & Christmas suggests that AI is already replacing thousands of workers.

But in their study, the MIT researchers sought to move beyond what they characterize as task-based comparisons and assess how feasible it is that AI will perform certain roles and how likely businesses are to actually replace workers with AI tech.

Contrary to what one (including this reporter) might expect, the MIT researchers found that the majority of jobs previously identified as being at risk of AI displacement aren't, in fact, economically beneficial to automate, at least at present.

The key takeaway, says Neil Thompson, a research scientist at MIT CSAIL and a co-author on the study, is that the coming AI disruption might happen slower and less dramatically than some commentators are suggesting.

"Like much of the recent research, we find significant potential for AI to automate tasks," Thompson told TechCrunch in an email interview. "But we're able to show that many of these tasks are not yet attractive to automate."

Now, in an important caveat, the study only looked at jobs requiring visual analysis, that is, jobs involving tasks like inspecting products for quality at the end of a manufacturing line. The researchers didn't investigate the potential impact of text- and image-generating models, like ChatGPT and Midjourney, on workers and the economy; they leave that to follow-up studies.

In conducting this study, the researchers surveyed workers to understand what an AI system would have to accomplish, task-wise, to fully replace their jobs. They then modeled the cost of building an AI system capable of doing all this, and also modeled whether businesses (specifically, non-farm U.S.-based businesses) would be willing to pay both the upfront and operating expenses for such a system.

Early in the study, the researchers give the example of a baker.

A baker spends about 6% of their time checking food quality, according to the U.S. Bureau of Labor Statistics, a task that could be (and is being) automated by AI. A bakery employing five bakers making $48,000 per year could save $14,000 were it to automate food quality checks. But by the study's estimates, a bare-bones, from-scratch AI system up to the task would cost $165,000 to deploy and $122,840 per year to maintain, and that's on the low end.
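The bakery arithmetic is simple enough to check directly. A sketch using the article's figures (the function and its name are mine, for illustration):

```python
# Back-of-the-envelope version of the study's bakery example: the labor
# savings from automating a visual task vs. the cost of the AI system.
# Wage, headcount, and cost figures are the ones quoted in the article.

def annual_savings(num_workers: int, wage: float, task_share: float) -> float:
    """Labor cost attributable to the task that AI would take over."""
    return num_workers * wage * task_share

savings = annual_savings(num_workers=5, wage=48_000, task_share=0.06)
print(savings)  # 14400.0 -- roughly the $14,000 the article cites

# Against a system costing ~$165,000 to deploy and ~$122,840 per year
# to maintain, automation clearly loses on running costs alone:
print(savings > 122_840)  # False
```

The gap between $14,400 in savings and $122,840 in annual upkeep is the study's core point: the task is automatable, just not yet worth automating.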

"We find that only 23% of the wages being paid to humans for doing vision tasks would be economically attractive to automate with AI," Thompson said. "Humans are still the better economic choice for doing these parts of jobs."

Now, the study does account for self-hosted, self-service AI systems sold through vendors like OpenAI, which only need to be fine-tuned to particular tasks rather than trained from the ground up. But according to the researchers, even with a system costing as little as $1,000, there are lots of jobs, albeit low-wage and multitasking-dependent ones, that wouldn't make economic sense for a business to automate.

"Even if we consider the impact of computer vision just within vision tasks, we find that the rate of job loss is lower than that already experienced in the economy," the researchers write in the study. "Even with rapid decreases in cost of 20% per year, it would still take decades for computer vision tasks to become economically efficient for firms."
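To get a feel for the timescale the researchers describe, one can reuse the article's baker figures in a hypothetical projection (these numbers are taken from the example above, not from the study's own model, and the deployment cost is ignored for simplicity):

```python
# Hypothetical timeline: at a 20%/year cost decline, how long until the
# system's *operating* cost alone falls below the annual wage savings?
annual_savings = 14_400   # ~6% of five bakers' $48,000 wages
operating_cost = 122_840  # per-year cost from the article's low-end estimate
decline_rate = 0.20

years = 0
cost = operating_cost
while cost > annual_savings:
    cost *= (1 - decline_rate)
    years += 1

print(years)  # 10 years before operating cost alone dips below the savings
```

Even under this generous simplification, automation takes a decade to break even on running costs alone, which is consistent with the study's "decades" framing once deployment costs and reinvestment are factored in.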

The study has a number of limitations, which the researchers, to their credit, admit. For example, it doesn't consider cases where AI can augment rather than replace human labor (e.g., analyzing an athlete's golf swing) or create new tasks and jobs (e.g., maintaining an AI system) that didn't exist before. Moreover, it doesn't factor in all the possible cost savings that can come from pre-trained models like GPT-4.

One wonders whether the researchers might've felt pressure to reach certain conclusions from the study's backer, the MIT-IBM Watson AI Lab. The lab was created with a $240 million, 10-year gift from IBM, a company with a vested interest in ensuring that AI is perceived as nonthreatening.

But the researchers assert this isn't the case.

"We were motivated by the enormous success of deep learning, the leading form of AI, across many tasks and the desire to understand what this would mean for the automation of human jobs," Thompson said. "For policymakers, our results should reinforce the importance of preparing for AI job automation ... But our results also reveal that this process will take years, or even decades, to unfold, and thus that there is time for policy initiatives to be put into place. For AI researchers and developers, this work points to the importance of decreasing the costs of AI deployments and of increasing the scope of how they can be deployed. These will be important for making AI economically attractive for firms to use for automation."

The rest is here:

New MIT CSAIL study suggests that AI wont steal as many jobs as expected - TechCrunch

Carnegie Mellon reveals it was hit by a cyberattack over the summer – Engadget

A cyberattack hit Carnegie Mellon University last summer and the attackers breached personal data, according to a disclosure from the school last week. The Pittsburgh-based university, known for its top tech and computer science programs, said on Friday that the attack impacted 7,300 students, employees, contractors and other affiliates.

"There is no evidence of fraud or inappropriate use of the information from those files," a statement from CMU said. Still, the attackers likely accessed and copied data that included names, Social Security numbers and birth dates. With help from law enforcement, CMU disabled any access to that copied data, according to the school.

It started on August 25, when unauthorized users accessed CMU's systems. The university says it began recovery processes and an investigation into the incident, which concluded months later in December; notifications to impacted parties began to go out last week. Impacted parties will receive credit monitoring services to mitigate further damage.

CMU did not respond to a request for comment and further information about the attack by the time of publication.

Excerpt from:

Carnegie Mellon reveals it was hit by a cyberattack over the summer - Engadget