Category Archives: Computer Science

At tech companies, let’s share the reins | Context – Context

Large Language Models and other AI systems will only reach their full potential when a broader set of people build them

Kathy Pham is a computer scientist; a Senior Advisor at Mozilla; Vice President of AI and Machine Learning at Workday; and an Adjunct Lecturer at Harvard. Opinions here are her own and not on behalf of any affiliated organization.

In 2023, we learned a lot about AI models: What's possible, like discovering a new class of antibiotics. What's uncertain, like how exactly businesses will integrate chatbots into their operations. And how the technology might be regulated and implemented across governments, per the Blueprint for an AI Bill of Rights, the NIST Risk Management Framework, the EU AI Act, and the United States executive order on AI.

But for those within the industry, or those paying close attention, there was another big learning: the people building the foundation of the most influential technology of our era are overwhelmingly drawn from a very narrow set of disciplines, namely computer science, software engineering, and, even more specifically, machine learning.

It makes sense for the people writing the code to be computer scientists and programmers; that's our trade. But building these models is about a lot more than code. It's about identifying use cases, making design decisions, and anticipating and mitigating potential harms. In short: it's about making sure the technology isn't just functional, but also trustworthy. And this requires skill sets beyond machine learning and traditional computer science.

I know this well from my own experiences across academia, non-profits, government, and industry. The hallmark of truly impressive technology is productive applications, and companies only build the best and right things when they pull together experts across disciplines.

Five years ago, I published an essay about the consequences of this bias in the industry and how we might fix it. I argued that if we want to change the tech industry, to make software more fair and less harmful, then we need to change what classes are required for a computer science degree.

AI has grown exponentially since then. Its tremendous computing capacity means even more unpredictable, unintended harms. My original thesis remains true: We need to expand computer science curricula to include the humanities. But those changes take time. Right now, we also need to break down the persistent engineering / non-engineering divide directly at the industry level.

There's no shortage of stories about AI systems' harms: hallucinations that generate incorrect information, disinformation, sexist and racist outputs, toxic training sets. In many cases, these are problems that engineers overlooked, but that others, like cognitive scientists and media theorists, confronted after the fact. The order of operations is clearly backward: these harms should be addressed before the system is deployed. Proactive, not reactive, should be the status quo, and that's only possible with more varied expertise. It's true that some companies have made progress on this front with their trust and safety teams. But those teams are still siloed and splintered.

There are some bright spots these companies can emulate: In recent years, Airbnb has centered the work of Laura Murphy, a legal and civil rights expert, to fight discrimination in product features. In 2018, Salesforce established an Office of Ethical and Humane Use, a place for internal frontline employees, executives, and external experts across a broad range of functions to guide decision-making. (I was an inaugural advisory board member.) When AI really entered the zeitgeist, the office was ready for action.

It isn't a big ask for the engineers to talk with the lawyers or social scientists. A simple conversation ("Hey, if I create Booleans with two choices for gender in my model, is that OK?") can head off bias and the need for an audit later on. I'm often reminded of my late father, a truck driver. There are countless engineers working to streamline the trucking industry. But how many of them spend time speaking with actual truckers?
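To make the Boolean example concrete, here is a minimal sketch, with hypothetical field names and categories that are not drawn from any real system, of the difference between hard-coding gender as a two-value Boolean and modeling it as an extensible category with an explicit opt-out:

```python
# Hypothetical sketch: a two-value Boolean vs. an extensible categorical field.
# The class names and categories below are illustrative, not from any real model.
from dataclasses import dataclass
from enum import Enum

class Gender(Enum):
    WOMAN = "woman"
    MAN = "man"
    NONBINARY = "nonbinary"
    SELF_DESCRIBED = "self_described"
    PREFER_NOT_TO_SAY = "prefer_not_to_say"

@dataclass
class UserProfile:
    # A bare Boolean (e.g., is_female: bool) forces everyone into two buckets and
    # silently misrepresents anyone who fits neither; an enumerated category with
    # an explicit opt-out keeps the training data honest about what was collected.
    gender: Gender = Gender.PREFER_NOT_TO_SAY
```

The point is not these particular categories, but that the question gets asked before the field hardens into the model's features.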

In addition to overt harms like bias, there are also harms of omission. A dearth of humanities experts means a missed opportunity for AI systems with clear use cases and problem-solving capabilities. Right now, many AI systems are developed and deployed first, with a goal of finding a purpose later. Consider OpenAI's 128K context window or Anthropic's 200K context window: technically impressive, but lacking clear objectives until they are put to meaningful use for society. Research and development without a clear goal isn't necessarily a bad thing, but these systems require tremendous amounts of money and energy to train. And we also know real goals exist, like better cancer detection algorithms for people with darker skin. There are powerful examples of what's possible with this approach, like the Kiazi Bora app in Tanzania. Its creators identified a problem first, a lack of agricultural literacy among women in East Africa, and then built an AI chatbot to help solve it.

For AI systems to be more trustworthy, tech companies must prioritize a broad range of expertise. Yes, there has been encouraging progress on this front: Initiatives like All Tech Is Human are connecting engineering to the social sciences and humanities, programs like Mozilla's Responsible Computing Challenge are reimagining computer science curricula as encompassing humanistic studies, and teams like Workday's Responsible AI are convening boards that span disciplines.

What's next? Now we need industry change to match the accelerating speed of AI development and deployment.

Read the original:

At tech companies, let's share the reins | Context - Context

Caltech Professor Named Distinguished Member of the Association for Computing Machinery Pasadena Now – Pasadena Now

Adam Wierman [Credit: Caltech]

Adam Wierman, professor of computing and mathematical sciences and director of information science and technology, has been named a Distinguished Member of the Association for Computing Machinery (ACM).

"Many of these 52 new Distinguished Members have been selected for important technical achievements," Yannis Ioannidis, president of ACM, said. "Others have been chosen because of their service and/or work in computer science education, which lays the foundation for the future of the field."

To qualify as a Distinguished Member, candidates must have spent at least 15 years in the profession and made significant contributions to the field of computing, while also serving as a mentor for others.

Wierman takes a theory-first approach to system design, but his theoretical work in mathematical tools for machine learning has been quickly deployed into algorithms that optimize everything from adaptive EV charging (PowerFlex), smart grid management (SCE), and resource allocation in the cloud (Microsoft, Google) to the design of carbon-first data centers (HP, Apple). "One goal of the work in our group," Wierman says, "is to design the algorithms needed for 100 percent renewable-driven, carbon-free data centers."

As a professor at Caltech for the past 15 years, Wierman has mentored hundreds of undergraduates, graduate students, and postdocs in the science and art of making networked systems sustainable and resilient. He also teaches students about the present and future of computer science education from kindergarten through community college.

"Working with students and collaborators across computer science and beyond has been extremely rewarding," Wierman says. "It is an honor to have my work recognized by the ACM."

This year's ACM honorees are both academics and industry professionals. They come not only from the United States and Canada, but also from Italy, Denmark, Finland, Hong Kong, Israel, China, Scotland, Germany, and Belgium.


Go here to read the rest:

Caltech Professor Named Distinguished Member of the Association for Computing Machinery Pasadena Now - Pasadena Now

Artificial Intelligence’s place in the classroom – The Brown and White

The rapid evolution of artificial intelligence, commonly referred to as AI, has infiltrated college curriculums and students' classes, with many Lehigh professors adjusting their teaching methods and syllabi in response.

AI's efficiency is leading to enhanced productivity that will have a profound effect on the future of education and employment. Marketing, business, computer science and journalism majors are among the most prominently impacted.

Emily Gobreski, a professor of business communications, incorporates AI into her students assignments.

She said her curriculum includes drafting emails and creating a resume, for which she now allows students to use AI as a tool.

The AI resume checker and AI Big Interview are two specific tools she is a proponent of. She said the interview tool grades students based on how many times they use the words "like" or "um," their speaking rate and whether their responses fit the interview questions.
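As a rough illustration of the kind of scoring such a tool might perform (a hypothetical sketch, not the actual Big Interview product), a grader could count filler words and estimate speaking rate from a transcript:

```python
# Hypothetical sketch of filler-word and speaking-rate scoring for an interview
# transcript; a real tool would also need speech recognition, disambiguation of
# filler "like" from the verb, and a model of whether the answer fits the question.
import re

FILLERS = {"like", "um", "uh"}

def score_response(transcript: str, duration_seconds: float) -> dict:
    words = re.findall(r"[a-z']+", transcript.lower())
    filler_count = sum(1 for word in words if word in FILLERS)
    words_per_minute = len(words) / (duration_seconds / 60.0)
    return {"filler_count": filler_count, "words_per_minute": round(words_per_minute, 1)}

print(score_response("Um, I like to, um, solve problems quickly.", duration_seconds=6.0))
```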

"(AI is) almost a double-edged sword," Gobreski said. "It's a very powerful tool."

Gobreski said she was never resistant to AI and now sees it like a calculator.

"In elementary school, you have to know how to add, subtract and divide to work a calculator and trust its output," Gobreski said. "I believe it's the same with business communication writing. You learn the basics and then one day can use AI as a resourceful tool for assistance."

Kylie Hogan '27, a marketing major, is a student in Gobreski's class that uses the AI Big Interview technology.

She said the integration of AI into her academic curriculum has enhanced her readiness for the marketing field.

"The insights for AI feedback allow us to present our strongest profiles as we embark on our job search journey," Hogan said.

In addition to courses within the business school, the journalism department has also seen a rise in the use of AI.

While some professors have used AI as a tool in the classroom, others remain concerned with its use in the field.

John Vilanova, a journalism and communications professor, said he is concerned with AI's accuracy, as its responses can sometimes be patently false.

"You have these technologies that we like to think are going to be neutral and not going to have hidden opinions, hidden biases, but the reality is that they are very much informed by the world that made them and the profit and motives around them," Vilanova said.

He said AI's current technology is fact-based and impersonal, putting itself and reporting in separate ballparks.

Vilanova said there will come a time when AI can lead interviews, but he is unsure how the technology will compete with a human reporter.

"Will it get to know me? How good will it be at following up questions (or) respecting me?" Vilanova said. "I wouldn't want to tell a robot my life experiences."

Vilanova said he believes AI can never replace the significance of speaking to real people.

"(AI) can't give you the feeling of what it was like to have been in the room," Vilanova said. "It can't give you the sort of human experience of going and talking to someone about what participating in (a specific) event was like."

He said ChatGPT is damaging in the journalism classroom because students should be doing their own writing and developing their own style.

Getting the perfect grade, he said, should be of lesser concern than journalistic style development.

"I'm definitely a little disheartened, because I think despite all the ways that I've been trying to remake things, (ChatGPT) is still being abused at a rate that I'm not comfortable with," Vilanova said.

In another academic sphere, computer science majors are learning how to write and interpret code. However, AI is not yet capable of solving coding errors for them.

Brian Davison, a computer science and engineering professor, said despite AI's abilities to program and code, students are tested on their understanding of what the code is saying.

Davison said other computer science courses use AI as a tutor to answer complex questions.

Students can describe a problem and ask an AI chatbot to either tell them how to address the problem or to design a particular solution, he explained. The chatbot then comments on whether the answer is acceptable or whether its still missing something.

"The system might not be 100% correct, but it does a good job in teaching or tutoring someone in a topic, and it's a great way to utilize it," Davison said.

Edmond Sarkissian '27 is majoring in computer science and business and said he sees AI used often in the classroom.

He said he has seen many students use AI to cheat on assignments and get them done quicker.

Despite this, he said he views AI as a positive development overall, and his resistance to it has lessened with the advancements the technology has made in recent years.

He said he plans to study artificial intelligence more in the future.

"AI has proven helpful in education, and I think AI will continue to benefit my major in the following years," Sarkissian said.

Link:

Artificial Intelligence's place in the classroom - The Brown and White

Moshe Vardi stands up and shouts – The Rice Thresher

By Felicity Phelan 1/30/24 10:22pm

Moshe Vardi started his computer science career over 7,000 miles away, never planning to end up in the U.S.

"I finished my doctorate in Israel, and then I came to the United States just to do a post-doctorate at Stanford University," Vardi said. "My plan was to go back to Israel, but cherchez la femme [look for the woman]: I met my wife in Palo Alto, and the rest is history."

Vardi's 30-year career at Rice includes appointments as a University Professor and Distinguished Service Professor, seven honorary doctorates and a slew of science and technology awards, including three IBM Outstanding Innovation Awards.

Even after 40 years in the U.S., Vardi said his personality still reflects the culture of his home country.

"Israelis tend to be much more outspoken and blunt than Americans," Vardi said. "I don't know if this is my strength or weakness, but I'm very direct, and not everybody's comfortable with my directness."

One outlet for Vardi's directness is writing. During the Fall 2020 semester, he authored several Thresher opinion pieces critical of Rice's decision to return to campus despite the ongoing COVID-19 pandemic. Another opinion, published in September 2022, discussed the purpose universities should serve in society.


"If you look at Rice's mission statement, it's about education and research and betterment of the world," Vardi said. "We talk a lot about education [and] we want to be a research university. We talk very little about betterment of the world. It's there in the mission statement, but where are we talking about it? In classes, in where? It's just not there."

Vardi said he feels Rice students should focus on more than just academic and financial success.

"[To] the students, yes, go have a great career. But you have a responsibility for the common good," Vardi said. "Nobody's asking you to necessarily become Mother Teresa and go and spend your life in Calcutta feeding the poor, okay? But if everybody just thinks of themselves, we have a society that looks like the society we have."

In his support for the common good, Vardi is passionate about the interaction between technology and society. He's been leading Rice's Technology, Culture and Society Initiative since its founding in 2018 and was heavily involved in the recent change to make Rice's computer science curriculum more ethics-focused. In 2018, Vardi began teaching COMP 301: Ethics and Accountability in Computer Science. In Spring 2023, the computer science department voted to make it a required undergraduate course.

"The students really like the class because it talks about what computing is doing to society and social responsibility," Vardi said. "There is no final exam in the class; it's not a theoretical class. We ask them to write an essay about their own personal social responsibilities. And these essays to me are very, very moving."

Rodrigo Ferreira, one of Vardi's colleagues in the computer science department and the current professor of COMP 301, says he admires Vardi's commitment to pushing his values forward.

"I think [Vardi] stands out in that he uses his position as a kind of springboard to talk about the social and ethical issues that matter to him," Ferreira said. "He always finds time to push forward this initiative and this set of values that he believes are important and that I, too, believe are essential for the future of computer science and the world."

Vardi has also been outspoken about his experiences as a Jewish person on campus. On Dec. 6, 2023, he published an essay on Medium titled "A Moral Rot at Rice University," which he wrote in response to what he calls a pattern of antisemitism on Rice's campus, pointing to the passing of Student Association Senate Resolution 14.

Vardi said his original Medium essay received 30,000 views within two weeks of publication, far exceeding his expectations. The essay was also republished in the Houston Chronicle and The Times of Israel.

Though Vardi said the response to his essay was overwhelmingly positive, he did receive backlash. About a week after the essay's publication, Vardi said, an unknown person outside of the Rice community mailed him a note that read, "Americans are sick of Jews whining [about] how discriminated against they are. Most Jews are obnoxious [and] greedy. Look at yourself. Jews bring on most of their own problems."

Vardi says this hatred, though uncomfortable, informs his identity.

"I may look white. I don't consider myself white. I open my mouth; immediately people see I'm not from here," Vardi said. "[They ask] 'Where are you from?' I say I'm from Israel, so it takes about 30 seconds for people to realize that I'm Jewish, and then I stand aside. The people who hate Black people also hate me."

Negative or positive, Vardi says the potential response didn't factor into his decision to write and publish the Medium article.

"I wasn't thinking about what kind of response I'm going to get," Vardi said. "I just have to say, this is not acceptable. I have to stand up and shout."

See original here:

Moshe Vardi stands up and shouts - The Rice Thresher

Computer modeling is tracing the hidden evolution of sign languages – Popular Science

It's relatively easy to trace a written linguistic history; there's generally a lot of written documentation and records to study. Things get trickier, however, when attempting to examine a sign language's evolution. Most transformations within the more than 300 currently known sign languages (or SLs) around the world occurred without text, over generations of learners. Add in the centuries of marginalization experienced by Deaf and hard of hearing communities, and establishing concrete relationships between SLs becomes extremely difficult.

To help correct this long-standing issue, researchers recently created a novel computational program capable of analyzing the relationships between various SLs. The result, published today in Science, is a first-of-its-kind large-scale study that greatly expands linguists' understanding of sign language development while challenging long-held beliefs about its evolution.


"Many people mistakenly think that sign language is shared around the world, but really the world is full of a vibrant tapestry of different sign languages," Natasha Abner, study lead author and an associate professor of linguistics at the University of Michigan, writes in an email to PopSci.

For their study, Abner and her colleagues first compiled a video dictionary of core, resilient vocabulary across 19 modern sign languages, such as American, British, Chinese, French, Japanese, and Spanish, among others. For example, while a sign for "oak tree" may only occur in languages spoken in regions with oak trees, the concept of just a "tree" is much more ubiquitous. Researchers then broke down video demonstrations for the 19 signing variants for "tree" (along with many other words) into basic phonetic parameters, then entered it all into a massive database.

"What we do in the study is look at how the sign languages refer to these commonplace, universal objects in the world, and we work backwards to build a history of the language and languages," Abner says. "This built history helps us understand the histories of the communities in ways that the historical records cannot, because they are so limited and sparse."

The computational analysis program then examined the signed vocabulary glossary, categorizing each entry based on intricate factors like handedness (one- or two-handed signs), handshape, location, and movement.

"This coding system avoids outcomes driven by superficial similarities or differences in two key ways," reads a portion of the team's study. "One, possible character values in the coding system range from two distinct values (handedness) to 10 distinct values (handshape), so it is a highly articulated system capable of capturing and tracking fine-grained differences."

In tracing signed vocabularies' evolutions, researchers applied phylogenetic analysis, typically associated with biologically inherited traits, to physically conveyed communications.

"In our study, the genes of language are the words that the languages use to describe the world around them," says Abner. Pursuing this strategy meant that, instead of simply applying existing computational methods to sign language data, Abner's team used sign languages as the empirical basis for advancing the computational methods themselves.
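As a toy illustration of this character-based approach (a minimal sketch with made-up codes and values, not the team's actual dataset or pipeline), each sign can be coded on discrete phonetic characters such as handedness, handshape, location, and movement, and the pairwise distances between those codes are the raw material tree-building methods start from:

```python
# Toy sketch: coding signs as discrete phonetic characters and computing a simple
# pairwise distance matrix of the kind phylogenetic tree-building begins with.
# The example entries and their values are invented for illustration only.
from itertools import combinations

signs = {
    "ASL:tree": {"handedness": 2, "handshape": 5, "location": 3, "movement": 1},
    "BSL:tree": {"handedness": 2, "handshape": 7, "location": 3, "movement": 1},
    "LSF:tree": {"handedness": 2, "handshape": 5, "location": 3, "movement": 4},
}

def distance(a: dict, b: dict) -> float:
    """Fraction of phonetic characters on which two coded signs disagree."""
    return sum(a[key] != b[key] for key in a) / len(a)

for (name_a, code_a), (name_b, code_b) in combinations(signs.items(), 2):
    print(f"{name_a} vs {name_b}: {distance(code_a, code_b):.2f}")
```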


After examining the dataset, the team's program established two wholly independent European and Asian sign language families, alongside family trees for each one, as well as two distinct Asian sign language subfamilies. Some of the findings reinforced the already known, lasting effects of Western colonization, such as the relationship between British, Australian, and New Zealand sign languages at the expense of endangered or extinct indigenous variants.

Meanwhile, the documented influence of French sign language within the Western European language tree is backed up by France's help in expanding Deaf education schools during the 18th century. At the same time, the new computational analysis also revealed previously undocumented connections between British Sign Language and Western European varieties. To back up the program's claim, Abner's team referred back to limited historical records and found that they corroborated these links.

Abner believes these findings, alongside future advances, will allow sign language linguists to study even more languages and Deaf communities.

"We view this as an important component of demonstrating the equity between signed and spoken languages, and the fact that both are rooted in the biological capacity for language that is part of what makes us human," she tells PopSci.

"If we want to understand our humanity, then we cannot limit ourselves to spoken languages."

See original here:

Computer modeling is tracing the hidden evolution of sign languages - Popular Science

American Express offers jobs, internships to 25 FIU students – FIU News

Barbarella Castillo graduated from a private university shortly after the start of the COVID-19 pandemic. She had plans to pursue medical school but, like the rest of the world, she put those on hold. After re-evaluating her options, she decided to pursue her interest in computing and enrolled in the Knight Foundation School of Computing and Information Sciences at FIU's College of Engineering and Computing.

"It was an excellent decision," says Castillo, who in 2023 graduated from FIU with a second bachelor's degree and a full-time job waiting for her at American Express. She landed the position with the help of a unique three-week internship facilitated by Break Through Tech Miami.

American Express Vice President of Technology Donna Peters, who hosted some of the interns, says partnering with programs like Break Through Tech, which is designed to increase female employment in technology, helps support American Express's goal of embracing diversity to fuel creativity and innovation.

"We firmly believe that innovation thrives on diverse skillsets, backgrounds, and experiences," says Peters. "What is most satisfying for me is witnessing the 'aha' moments when the pieces of the puzzle come together, and student learnings come to life."

"Sprinternships," as they are called,are micro-internships designed to bridge the gap between academia and industry. Unlike traditional internships, they dont require students to ace a technical interview as a prerequisite. This removes one of the most common stumbling blocks for students attempting to enter the workforce, says Director of Break Through Tech Miami Nimmi Arunachalam.

"Hiring managers can get a better sense for a student's long-term potential by seeing them perform in a natural environment rather than in a hypothetical, problem-solving scenario in front of a whiteboard, like what is typically presented in a traditional tech interview," Arunachalam said.

Evaluating students in an organic setting, where they work on a team and solve a challenging project, could be a better way of predicting future performance and fit.

Twenty-five students participating in the inaugural cohort received either a job offer or an internship offer from American Express.

View original post here:

American Express offers jobs, internships to 25 FIU students - FIU News

Monday, February 5, 2024 | Daily Bulletin – The Iron Warrior

Ming Li receives 2024 IEEE Computer Society W. Wallace McDowell Award

This is an excerpt of an article originally published on Waterloo News.

University Professor Ming Li is the 2024 recipient of the IEEE Computer Society W. Wallace McDowell Award, a prestigious honour conferred for his pioneering and enduring contributions to modern information theory and bioinformatics.

Named after W. Wallace McDowell, director of engineering at IBM during the development of the landmark IBM 701 computer, the award has been conferred annually since 1966 by the IEEE Computer Society for outstanding theoretical, design, educational, practical, or other innovative contributions.

"Congratulations to Ming on receiving the 2024 Wallace McDowell Award," said Raouf Boutaba, Professor and Director of the Cheriton School of Computer Science. "This recognition is richly deserved. Ming has not only been a pioneer in Kolmogorov complexity, and in doing so has laid the foundation for a modern information theory, but he is also an innovator in computational biology, applying machine learning to develop personalized immunotherapies to treat cancer and infectious diseases."

University Professor Li completed his PhD at Cornell University in 1985, followed by a postdoctoral fellowship at Harvard. In 1989 he joined the University of Waterloo as an Associate Professor in what was then the Department of Computer Science, and was promoted to Full Professor in 1994. He was named the Canada Research Chair in Bioinformatics, a prestigious position he has held continuously since 2002. In 2009 he was named a University Professor, a title conferred to University of Waterloo faculty to recognize exceptional scholarly achievement and international pre-eminence.

Read the full article on the School of Computer Science website.

A message from the Faculty of Mathematics.

"It is always a difficult balance to strike," said Mark Giesbrecht, dean of the Faculty of Mathematics, "between reflecting what we believe, and defining actionable things we can go forward with and actually apply in our everyday life."

Giesbrecht was welcoming participants to the January 31 launch of the Equity and Inclusive Communities Principles, a new set of eleven principles developed over the last two years for the Faculty of Mathematics by a committee of eighteen faculty, staff, and students, including members from the Office of Indigenous Relations and the Centre for Teaching Excellence's Indigenous Knowledges and Anti-Racist Pedagogies Unit. This work was also supported by Waterloo's Equity, Diversity, Inclusion & Anti-Racism office.

"The principles were particularly designed to be accessible to people regardless of where they were on their journey," explains Jeremy Steffler, the Faculty of Mathematics' Equity Officer and leader of the project.

"One of the great understandings gained through this process was an appreciation of the sense of connection and the degree of reciprocity the Faculty of Mathematics has with many communities," Steffler says. "While the Principles are intended to guide our efforts toward greater equity and inclusion, the Faculty is part of a university and other external communities, and many diverse communities exist within the Faculty. Choosing to acknowledge communities right in the name helps frame our intentions to be thinking in a more holistic way."

Read the full article on the Faculty of Mathematics website.

What is happening? We will be testing the University's campus-wide emergency communication system.

When is this happening? Wednesday, February 7 at 1:30 p.m.

What is the impact? Emergency communication channels being tested include:

Messaging: The message displayed will read "TEST of the UW Emergency Notification System. During an actual emergency or threat you would receive instructions. No action is required." More information regarding the emergency would be available at alert.uwaterloo.ca.

Approximately 15 minutes after the test activation message is sent, a deactivation message will display: "The test of the UW Emergency Notification System is complete."

In the event of a real emergency during this test, please contact Police Services at 519-888-4911, or ext. 22222.

Be sure to install the WatSAFE app on your device and WatSAFE Desktop Notification tool on your desktop/laptop to receive this test message, and more importantly, to stay informed of campus emergency situations. Visit the WatSAFE website for more details.

Questions or concerns? Please contact the IST Service Desk via the IST Help Portal.

A message from the Secretariat.

The University of Waterloo Senate comprises faculty, undergraduate and graduate students, alumni, governors, and administrative staff who are both elected and ex-officio members. This governing body of the institution is the highest authority on academic matters and meets regularly to discuss topics such as academic programs, educational policies, appointments, and other academic priorities.

At the January 29, 2024 meeting, the following items were approved:

Additionally, Senate received updates for information:

Visit the Senate website to find the meeting agenda and related materials for all recent Senate meetings, as well as dates for upcoming meetings. The Secretariat will make Zoom links available to those wishing to attend meetings virtually.

If you have any questions, please contact the Secretariat at senate@uwaterloo.ca.

The 5th annual "Mirrorless Monday" campaign is happening in washrooms across the University campus today. Mirrorless Monday is a movement to encourage body acceptance through various messages on mirrors and a campaign that seeks to remind everyone that their self-worth is not tied to a reflection in a mirror.

During the one-day campaign, select washroom mirrors across campus will be partly covered with supportive messages and resources including Campus Wellness, Counselling Services, MATES, Good2Talk, and NEDIC. The messages will be placed on the mirrors at the beginning of the day and removed by 10:00 a.m. the following day. The Mirrorless Monday messages will be shared on the @uwaterloolife Instagram account.

It's election season for the Waterloo Undergraduate Student Association (WUSA), as candidates vie for Board of Director and executive positions. Candidate Booth Days are taking place today in the SLC Marketplace and will run on February 7 and 8 from 10:00 a.m. to 7:00 p.m. The election will be held from February 12 to 14.

Read more:

Monday, February 5, 2024 | Daily Bulletin - The Iron Warrior

Technological University Dublin: From computer science basics to brilliance – Study International News

From artificial intelligence revolutionising healthcare to blockchain transforming finance, the computer science field stands as the heartbeat of innovation, one that holds a lot of power.

In the 21st century, computer science is indisputably one of the main disciplines inspiring new technologies and innovations for a better world.

Hence, for mid-career professionals, diving into the world of algorithms and code is not merely a career path, but a gateway to shaping the future.

Recognising the growing interest in reskilling, the School of Computer Science at Technological University Dublin (TU Dublin) has designed a distinct pathway programme for graduates from non-computing disciplines. It's the ideal programme for obtaining the essential computing concepts required for entry into the MSc in Computing programme.


The master's qualifier conversion programme is as innovative as it is flexible. With no assumptions about prior computing knowledge, this full-time course lays a solid foundation in the core concepts of computer science, focusing on both theory and practice. It comprises five essential modules: Object-Oriented Software Development; Information Systems; Architecture, Operating Systems, and Networks; Web and User Interface Design; as well as Systems Analysis and Testing.

Each module integrates lecture sessions and hands-on lab sessions, where students actively apply theoretical concepts in practical tasks. Lecturers and tutor demonstrators are readily available during these sessions to offer guidance and support, ensuring a well-rounded learning experience.

"You will learn to programme using Python, how to design and build software and websites, how to access and build databases, and all about the architecture of computers and their operating software," explains course chair Jonathan McCarthy.

Breadth aside, perhaps the best part about this pathway is its 100% online delivery. Weekly live online lectures and lab sessions provide ample opportunities for interaction with lecturers. With a continuous assessment strategy, students can complete all coursework online, eliminating the need to attend campus and offering the flexibility needed to balance professional and academic commitments.

With a solid foundation, those who complete the programme will be able to add a Continuing Professional Development Diploma in Fundamentals of Computing to their CV. However, those who do well, especially in the Object-Oriented Software Development module, and achieve an overall average mark of 60%, can progress directly to TU Dublin's MSc in Computer Science (Data Science) or MSc in Computer Science (Advanced Software Development).

"The aim of these MSc programmes is to develop deep practical and technical knowledge in a specialist area with exposure to leading-edge topics, while also developing the professional, research and technical abilities of students," shares Andrea Curley, programme chair of the MSc Computer Science.


The Data Science programme was designed to equip students with both technical prowess, including data management, mining, and machine learning, and the essential soft skills in communication, research, and problem-solving. Graduates not only navigate the complexities of vast datasets but also derive meaningful insights, making them invaluable assets in any organisational landscape.

On the other hand, specialising in Advanced Software Development entails gaining a better understanding of the evolving global digital economy. The programme digs deep into advanced technical modules covering programming, design, databases, architecture, and web development.

Beyond technical expertise, much like the Data Science specialisation, this curriculum places a strong emphasis on honing key professional and technical communication skills. This ensures that graduates not only contribute as proficient software developers but also thrive as industry professionals.

"Both MSc programmes are delivered in full-time mode and part-time mode," says Curley. Full-time mode takes three semesters, or approximately 16 months, while part-time mode is much more flexible due to its modular approach. Students are allowed to choose the pace at which they move through these programmes.

All routes guarantee professional success, partially due to the fact that TU Dublin prides itself on strong industry engagement. Internships, industry projects, and industry accreditation are seamlessly integrated into programmes. In fact, the British Computer Society, in an exemplary commendation, recognises its commitment to bridging the gap between academia and industry.

Those who choose to broaden their intellectual horizons on campus will be pleased to know they will always be surrounded by opportunities. Nestled in the heart of the Silicon Valley of Europe, the Dublin campus stands amidst the highest concentration of ICT activity. In fact, TU Dublin plays a pivotal role in this thriving ecosystem, fostering partnerships with major industry players, including the top eight ICT multinationals with European headquarters in Dublin.

To learn more about the School of Computer Science at TU Dublin, where the future of technology is not just learned but shaped, click here.


Read more:

Technological University Dublin: From computer science basics to brilliance - Study International News

20 films about math, mathematicians and math geniuses – Yardbarker

The words "spy movie" and "Robert Redford" usually make one think of the 1975 classic Three Days of the Condor, but that's not the only film fitting this description. In 1992, Redford starred alongside Dan Aykroyd, Ben Kingsley, Mary McDonnell, River Phoenix, Sidney Poitier and David Strathairn in Sneakers, a spy caper about a group of hackers, techies and espionage experts who are tasked by the government to steal a code-breaking device, only to get tangled up in the investigation of a mathematician's murder. For a lighthearted drama, the producers of Sneakers took the math aspects seriously and hired Leonard Adleman as a mathematical consultant. At the time, Adleman was mainly known for being the person who coined the term "virus." However, he is now viewed as the father of DNA computing and was the winner of the 2002 Turing Award for his co-creation of the RSA encryption algorithm, which is now widely used in secure data transmissions.

Continue reading here:

20 films about math, mathematicians and math geniuses - Yardbarker

World’s 1st fault-tolerant quantum computer launching this year ahead of a 10000-qubit machine in 2026 – Livescience.com

The world's first commercial fault-tolerant quantum computer with "logical qubits" may be running before the year's end.

Logical qubits, physical quantum bits (qubits) connected through quantum entanglement, reduce errors in quantum computers by storing the same data in different places. This diversifies the points of failure when running calculations.

The new machine, which has 256 physical and 10 logical qubits, will launch in late 2024, representatives from QuEra, the startup that is building it, said in a statement.

The announcement follows a new study, published Dec. 6, 2023 in the journal Nature, in which researchers from Harvard, QuEra and several other institutions demonstrated a functioning quantum computer that contained 48 logical qubits, the largest number of logical qubits tested to date.


"It is the first machine with quantum error correction," study co-author Harry Zhou, a physicist at QuEra and Harvard University, told Live Science in an email.

While this computer doesn't have enough power to be useful on its own, it provides a platform on which software programmers can start testing code for future quantum computers, Zhou said.

While conventional computers store information in bits with a value of either 0 or 1, quantum computers use qubits, which exist in a superposition of 0 and 1, thanks to the laws of quantum mechanics.

Qubits can also be stitched together using quantum entanglement to exist in multiple states simultaneously. This enables them to perform many calculations much faster than classical computers, assuming you can build a quantum computer with enough of them. But qubits can easily be disturbed, making them notoriously error-prone. Roughly 1 in 1,000 fail, versus 1 in 1 billion billion bits in conventional computers.

Quantum computers could outpace the best supercomputers if they incorporate millions of qubits, but the largest quantum computer built so far only has around 1,000 qubits, and qubits' high failure rate limits potential scale-up. Error correction could counteract qubits' tendency to fail, and building logical qubits is one way of doing it.
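A quick back-of-the-envelope calculation, using the rough per-operation failure rates quoted above and an arbitrary example workload, shows why uncorrected qubits cannot simply be scaled up:

```python
# Back-of-the-envelope: chance an uncorrected computation finishes with no errors,
# using the rough failure rates quoted above and an arbitrary workload size.
operations = 1_000_000  # example workload; real algorithms vary widely

for name, p_fail in [("qubits (~1 in 1,000)", 1e-3), ("classical bits (~1 in 1e18)", 1e-18)]:
    p_success = (1 - p_fail) ** operations
    # The qubit case underflows to 0.0: effectively no chance of an error-free run.
    print(f"{name}: P(no errors in {operations:,} ops) ~ {p_success:.3e}")
```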

The new error-correction system relies on data redundancy, where the same piece of data is stored in multiple places, Zhou said. Logical qubits perform the same calculations across several physical qubits, vastly reducing error rates: if one or more physical qubits fail, the data is available elsewhere, so calculations can continue.

To make the logical qubit, researchers applied error-correcting computer code to regular qubits. They then set up logical gates, or circuits, between the qubits to entangle them. The quantum computer then calculates the "syndrome," a measure of whether an error is likely to have occurred. Using this information, the quantum computer corrects the errors and proceeds to the next step.
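As a rough classical analogue of this redundancy-plus-syndrome idea (a minimal sketch, not QuEra's actual quantum error-correction scheme), a repetition code stores one logical bit across three physical bits, measures parity-based syndromes, and corrects a single flipped bit:

```python
# Minimal classical analogue of syndrome-based error correction: a 3-bit
# repetition code. This only illustrates the redundancy/syndrome idea; real
# quantum codes correct errors without directly reading out the data qubits.
import random

def encode(logical_bit: int) -> list[int]:
    return [logical_bit] * 3  # store the same data in three places

def noisy(physical: list[int], flip_prob: float = 0.1) -> list[int]:
    return [bit ^ (random.random() < flip_prob) for bit in physical]

def syndrome(physical: list[int]) -> tuple[int, int]:
    # Parity checks between neighbouring bits flag where a flip likely occurred.
    return (physical[0] ^ physical[1], physical[1] ^ physical[2])

def correct(physical: list[int]) -> list[int]:
    s = syndrome(physical)
    if s == (1, 0):
        physical[0] ^= 1
    elif s == (1, 1):
        physical[1] ^= 1
    elif s == (0, 1):
        physical[2] ^= 1
    return physical

def decode(physical: list[int]) -> int:
    return int(sum(physical) >= 2)  # majority vote

received = noisy(encode(1))
print(received, "->", decode(correct(received)))
```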

The new qubits represent a significant advance over past efforts. In 2023, the Google Quantum AI Lab demonstrated a 2.9% error rate using three logical qubits; QuEra's error rate is 0.5% with 48 logical qubits. The world leader is the University of Oxford, which has achieved error rates of less than 0.01%, but only between two-qubit gates.

Last year, IBM also demonstrated error-correction technology in its 127-qubit Heron chip, which reduced error rates fivefold compared with its other chips. But its first commercial fault-tolerant machine isn't expected until 2029.

QuEra plans to launch several quantum computers in the coming years, starting with a 30-logical-qubit, 3,000-physical-qubit machine coming out in 2025. Its monster, a machine with more than 10,000 physical qubits and 100 logical qubits, is scheduled for 2026. "At 100 logical qubits, the [2026] machine can perform correct calculations that exceed the capability of today's supercomputers," Zhou said.

Read the original:

World's 1st fault-tolerant quantum computer launching this year ahead of a 10000-qubit machine in 2026 - Livescience.com