Machine and Human Factors in Misinformation Management – Information Processing and Management Conference – Knovel
Title of the Special Issue/Thematic Track
Machine and Human Factors in Misinformation Management (VSI: IPMC2022 MISINFO)
- Damiano Spina (*), Senior Lecturer and DECRA Fellow, School of Computing Technologies, RMIT University, Melbourne, Australia. email: damiano.spina@rmit.edu.au
- Kevin Roitero, Postdoctoral Research Fellow, Department of Mathematics, Computer Science, and Physics, University of Udine, Udine, Italy. email: kevin.roitero@uniud.it
- Stefano Mizzaro, Full Professor, Department of Mathematics, Computer Science, and Physics, University of Udine, Udine, Italy. email: mizzaro@uniud.it
- Gianluca Demartini, Associate Professor, School of Information Technology and Electrical Engineering, The University of Queensland, Brisbane, Australia. email: g.demartini@uq.edu.au
- Kalina Bontcheva, Full Professor, Department of Computer Science, The University of Sheffield, United Kingdom. email: k.bontcheva@sheffield.ac.uk
(*) Managing Guest Editor.
The rise of online misinformation poses a threat to the functioning of the democratic process as a whole. False information is spreading at an exponential rate across the web and social media platforms, an expansion fueled by novel tools (e.g., large language models) capable of processing and generating large amounts of data. This has enabled large-scale counter-narratives and propaganda strategies in online communities, which have a major negative impact and can influence individual and collective decision-making processes. To counter this worrying trend, researchers are developing data-driven and hybrid algorithmic methods to detect misinformation and control its spread. The proposed algorithms and solutions are complex and can be grouped into categories based on the underlying approach: fully automatic algorithms based on artificial intelligence, machine learning, and deep learning; human-powered systems, relying either on panels of experts or on crowd workers; and hybrid human-in-the-loop approaches that try to fruitfully combine the two. A better understanding of how humans and machines can effectively work together to manage and fight misinformation is needed.
This special issue invites submissions dealing with artificial, human, and hybrid techniques aimed at fighting the spread of misinformation.
Topics of interest include, but are not limited to:
- Predictive models to characterize and fight misinformation spread (e.g., trust and reputation models, formal models, online misinformation diffusion models, forecasting models).
- Machine learning, deep learning, transfer learning, reinforcement learning, graph based approaches, and probabilistic methods (e.g., classification, unsupervised / semi-supervised / supervised learning, applications, architectures, loss functions, training approaches) applied to fight misinformation.
- Infrastructures and resources for misinformation management (e.g., datasets, implementations, frameworks, architectures).
- Fairness, accountability, transparency, and safety of systems and processes to fight misinformation.
- Use of social media to study and combat misinformation online.
- Human computation and crowdsourcing methodologies to fight misinformation.
- Hybrid and multi-agent approaches to fight misinformation.
- Biases in artificial, human, and hybrid systems used to address misinformation.
- Adversarial approaches to misinformation (e.g., robustness of systems, automatic generation of misinformation).
- Information provenance and traceability.
- Filtering and recommendation systems for content dealing with misinformation (e.g., content-based filtering, collaborative filtering, recommender systems).
- User-centered (e.g., user experience, effectiveness, engagement) and system-centered (e.g., metrics, experimental design, benchmark) evaluation.
- Fighting multimedia misinformation (text, audio, image, and video; deep fakes).
- Fighting multi- and cross-lingual misinformation.
- Generation of explanations and explainable algorithms to deal with misinformation.
- Regulation, policies, and socio-economical perspectives on misinformation and approaches to fight misinformation.
- Influence and psychological aspects of misinformation.
- Social network analysis, influencer detection of misinformation, and fake news spreader profiling.
- Corpora, annotation, and test collections (including tools and resources) to build and evaluate systems and processes to fight misinformation.
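Several of the topics above (machine learning, classification, corpora) can be made concrete with a deliberately tiny sketch: a Laplace-smoothed Naive Bayes text classifier over word counts. Everything here, the toy corpus, the labels, and the example phrases, is hypothetical and purely illustrative; real misinformation systems use far richer features, models, and training data.

```python
import math
from collections import Counter

# Toy labeled corpus (made-up examples, for illustration only).
TRAIN = [
    ("miracle cure doctors hate this trick", "misinfo"),
    ("shocking secret they hide from you", "misinfo"),
    ("study published in peer reviewed journal", "credible"),
    ("official statistics released by the agency", "credible"),
]

def train_nb(examples):
    """Collect per-class word counts and class priors."""
    counts, priors = {}, Counter()
    for text, label in examples:
        priors[label] += 1
        counts.setdefault(label, Counter()).update(text.split())
    return counts, priors

def classify(text, counts, priors):
    """Return the most probable class under Laplace-smoothed Naive Bayes."""
    vocab = {w for c in counts.values() for w in c}
    total = sum(priors.values())
    best, best_lp = None, float("-inf")
    for label, cnt in counts.items():
        lp = math.log(priors[label] / total)
        denom = sum(cnt.values()) + len(vocab)  # add-one smoothing
        for w in text.split():
            lp += math.log((cnt[w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

counts, priors = train_nb(TRAIN)
print(classify("shocking miracle trick", counts, priors))  # prints "misinfo"
```

A hybrid human-in-the-loop system of the kind the call describes would sit on top of such a model, routing low-confidence predictions to expert or crowd annotators.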
Submit your manuscript to the Special Issue category (VSI: IPMC2022 MISINFO) through the online submission system of Information Processing & Management: https://www.editorialmanager.com/ipm/
Authors should prepare their submissions following the IP&M Guide for Authors (https://www.elsevier.com/journals/information-processing-and-management/0306-4573/guide-for-authors). All papers will be peer-reviewed following the IP&MC2022 reviewing procedures. Please note IP&M's strict no-preprint policy outlined in the author guidelines.
Authors of accepted papers are expected to participate in IP&MC2022 and present their work to the community for feedback. Based on the feedback received at the IP&MC2022 conference, accepted papers will be invited for revision. Submissions will be given premium handling at IP&M following its peer-review procedure and, if accepted, published in IP&M as full journal articles, with an option for a short conference version at IP&MC2022.
Please see this infographic for the manuscript flow: https://www.elsevier.com/__data/assets/pdf_file/0003/1211934/IPMC2022Timeline10Oct2022.pdf
For more information about IP&MC2022, please visit https://www.elsevier.com/events/conferences/information-processing-and-management-conference
FIRST Robotics Competition inspires a new wave of local talent in information technology – News 5 Cleveland
CLEVELAND - A lot of times it's the athletes on the field who get all the attention, but there's a competition making a comeback this year that has a lot of "STEM-letes" flexing their muscles.
"We like to refer to it as a varsity sport of the mind," said JonDarr Bradshaw, the community engagement coordinator at Great Lakes Science Center and a volunteer with FIRST FRC Robotics.
After COVID cancellations, the FIRST Robotics Competition is back. The Buckeye Regional qualifying matches, last held at Cleveland's Wolstein Center in 2018, return this March, and the world championships will follow in April in Texas.
"It's the most intense and amazing experience we can give our students to get them interested in IT," said Chelsey Kohn, director of the Tech Talent Pipeline for CMSD and Cleveland State. "They build an extremely large robot. They go to a really fun competition and they get to learn about teamwork and helping each other along the way."
While many Northeast Ohio schools participate in FIRST FRC Robotics, it's rare for large urban districts to compete, and this year four Cleveland schools have teams.
Grants from the Great Lakes Science Center help make it possible, specifically by reaching out to young women and minorities, both of whom are underrepresented in engineering fields.
"Part of our success as a nation is the fact that we are so diverse," said Bradshaw. "It brings in new ideas, new viewpoints and ways to tackle problems that are different."
Bringing their talents to the table are students like Quynh Tran, a senior at John Marshall School of Technology.
"Growing up I didn't know a lot of female engineers or computer scientists, so I want to be that example for everyone else," she said.
East Tech senior Jamyah Howard said his engineering teacher, Shawn Thomas, inspired him to pursue robotics.
"I really like the mechanical part of it," he said.
Thomas said there's something special about Howard and she knows he has a bright future.
"I've been working with him since his freshman year," she said. "I'm counting on him to do some positive things and continue on in the field of engineering."
Exposure and opportunity are so important for young minds. So is the support.
The teams have mentors during the six-week after-school program who help them build their robots, mentors like Timothy Hatfield, who competed in FIRST Robotics himself in high school and now works at PHASTAR Corporation.
"It feels amazing to pass down what I've learned to other students, and that they can carry the work on in hopes that they also become mentors themselves," said Hatfield.
The hope is also that the students become employed here at home in the IT field, one of Northeast Ohio's top industries, offering some of the most in-demand jobs with the best wages.
"If they're a student who likes IT there's a place for them," said Kohn. "You know, Cleveland needs home-grown Cleveland talent. Companies are going to leave the region if they don't get their positions filled."
The convergence of deep neural networks and immunotherapy – TechCrunch
Luis Voloch is the CTO and co-founder of Immunai. He was previously the Israel Tech Challenge's head of data science, worked on varied machine learning efforts at Palantir, and led machine learning initiatives for modeling DNA data at MyHeritage.
What do deep neural networks and cancer immunotherapy have in common?
While both are among the most transformational areas of modern science, 30 years ago, these fields were all but ridiculed by the scientific community. As a result, progress in each happened at the sidelines of academia for decades.
Between the 1970s and 1990s, some of the most prominent computer scientists, including Marvin Minsky, in his book Perceptrons, argued that neural networks (the backbone of most modern AI) would never work for most applications. He exposed flaws in the early conceptions of neural networks and argued that the whole approach was ineffective.
Meanwhile, during the 1980s through the 2000s, neural network pioneers and believers Geoffrey Hinton, Yoshua Bengio, and Yann LeCun continued their efforts and pursued their intuition that neural networks would succeed. These researchers found that most of the original ideas were correct but simply needed more data (think of ImageNet), computational power and further modeling tweaks to be effective.
Hinton, Bengio and LeCun were awarded the Turing Award in 2018 (the computer science equivalent of a Nobel Prize) for their work. Today, their revelations have made neural networks the most vibrant area of computer science and have revolutionized fields such as computer vision and natural language processing.
Cancer immunology faced similar obstacles. Treatment with the IL-2 cytokine, one of the first immunomodulatory drugs, failed to meet expectations. These outcomes slowed further research, and for decades, cancer immunology wasn't taken seriously by many cancer biologists. Thanks to the effort and intuition of a few, however, it was shown decades later that the concept of boosting the immune system to fight cancer had objective validity. It turned out that we just needed better drug targets and combinations, and eventually, researchers demonstrated that the immune system is the best tool in our fight against cancer.
James P. Allison and Tasuku Honjo, who pioneered the class of cancer immunotherapy drugs known as checkpoint inhibitors, were awarded the Nobel Prize in 2018.
Though widely accepted now, it took decades for the scientific establishment to accept these novel approaches as valid.
Machine learning and immunotherapy have more in common than historical similarities. The beauty of immunotherapy is that it leverages the versatility and flexibility of the immune system to fight different types of cancers. While the first immunotherapies showed results in a few cancers, they were later shown to work in many other cancer types. AI, similarly, utilizes flexible tools to solve a wide range of problems across applications via transfer and multitask learning. These processes are made possible through access to large-scale data.
Here's something to remember: The resurgence of neural networks started in 2012 after the AlexNet architecture demonstrated 84.7% accuracy in the ImageNet competition. This level of performance was revolutionary at the time, with the second-best model achieving 73.8% accuracy. The ImageNet dataset, started by Fei-Fei Li, is robust, well labeled and high quality. As a result, it has been integral to how far neural networks have brought computer vision today.
Interestingly, similar developments are happening now in biology. Life sciences companies and labs are building large-scale datasets with tens of millions of immune cells labeled consistently to ensure the validity of the underlying data. These datasets are the analogs of ImageNet in biology.
We're already seeing these large, high-quality datasets giving rise to experimentation at a rate and scale that was impossible before. For example, machine learning is being used to identify immune cell types in different parts of the body and their involvement in various diseases. After identifying patterns, algorithms can map or predict different immune trajectories, which can then be used to interpret, for example, why some cancer immunotherapies work on particular cancer types and some don't. The datasets act as the Google Maps of the immune system.
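As a loose illustration of the cell-type identification step, one common baseline is nearest-centroid annotation: compare each cell's expression profile to reference profiles for known cell types and pick the closest. The marker values and type names below are invented for the sketch; real pipelines operate on thousands of genes per cell, not two.

```python
import math

# Hypothetical reference profiles: mean expression of two marker genes
# per known cell type (toy numbers, not real measurements).
REFERENCE = {
    "T cell":  (0.9, 0.1),   # high marker 1, low marker 2
    "B cell":  (0.1, 0.9),   # low marker 1, high marker 2
    "NK cell": (0.5, 0.5),
}

def annotate(cell, reference=REFERENCE):
    """Label a cell with the reference type whose profile is nearest
    in Euclidean distance (a bare-bones nearest-centroid classifier)."""
    return min(reference, key=lambda t: math.dist(cell, reference[t]))

print(annotate((0.85, 0.2)))  # prints "T cell"
```

Applied to millions of consistently labeled cells, this kind of assignment is what lets researchers track which cell populations appear in which diseases.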
Mapping patterns of genes, proteins and cell interactions across diseases allows researchers to understand molecular pathways as the building blocks of disease. The presence or absence of a functional block helps interpret why some cancer immunotherapies work on particular cancer types but not others.
Mapping pathways of genes and proteins across diseases and phenotypes allows researchers to learn how they work together to activate specific pathways and fight multiple diseases. Genes can be part of numerous pathways, and they can cause distinct types of cells to behave differently.
Moreover, different cell types can share similar gene activities, and the same functional pathways can be found in various immune-related disorders. This makes a case for building machine learning models that perform effectively on specific tasks and transfer to other tasks.
Transfer learning works in deep learning models, for example, by taking simple patterns (in images, think of simple lines and curves) learned by early layers of a neural network and leveraging those layers for different problems. In biology, this allows us to transfer knowledge on how specific genes and pathways in one disease or cell type play a role in other contexts.
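The paragraph above can be sketched in a few lines: keep an early feature map fixed ("frozen") and train only a new head for the target task. The feature map, the perceptron head, and the toy task below are hand-made stand-ins for pre-trained layers and real data, chosen only to show the frozen/trainable split.

```python
# A frozen "early layer": a fixed feature map, standing in for weights
# learned on a source task. In practice these come from pre-training.
def frozen_features(x):
    """Map a raw 2-D input to two simple fixed features."""
    return (x[0] + x[1], x[0] - x[1])

def train_head(data, epochs=50, lr=0.1):
    """Train only a new linear head on top of the frozen features
    (plain perceptron updates; the feature map never changes)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:  # y is +1 or -1
            f = frozen_features(x)
            pred = 1 if w[0] * f[0] + w[1] * f[1] + b > 0 else -1
            if pred != y:  # perceptron mistake-driven update
                w = [w[0] + lr * y * f[0], w[1] + lr * y * f[1]]
                b += lr * y
    return w, b

# New target task (toy data): label +1 when the coordinates sum is large.
DATA = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.1, 0.2), -1), ((0.2, 0.1), -1)]
w, b = train_head(DATA)
```

The biological analogy is the same shape: the "frozen" part encodes shared knowledge about genes and pathways, and only a small task-specific component is fit to the new disease or cell type.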
AI research that addresses the effects of genetic changes (perturbations) on immune cells and their impact on the cells and possible treatments is increasingly common in cancer immunology. This kind of research will enable us to understand these cells more quickly and lead to better drugs and treatments.
With large-scale data fueling further research in immunotherapy and AI, we are confident that more effective drugs to fight cancer will appear soon, thus giving hope to the over 18 million people who are diagnosed with cancer every year.
EeroQ Chooses the Terminal in Chicago for 9,600 Square Foot Quantum Computing Lab – Yahoo Finance
Next-generation computer to be built in Humboldt Park, generating momentum for Chicago's INVEST South/West initiative and quantum computing hub
CHICAGO, January 06, 2022--(BUSINESS WIRE)--EeroQ, a leader in the field of quantum computing, is moving its headquarters to Chicago at The Terminal, an IBT Group development. EeroQ has signed a lease for a 9,600-square-foot engineering lab and office, which will continue the strong momentum for both Chicago quantum computing and Mayor Lightfoot's INVEST South/West initiative, as well as The Terminal's position as a premier hub for advanced technology in Chicago.
Quantum computing (QC) offers immense potential for the creation of new pharmaceuticals and materials, financial algorithms, and more. In a QC, information is stored in qubits, which go beyond the 0s and 1s of today's computers. While this opens the door to solving problems that are impossible for today's computers, building a large-scale and reliable QC is an immense challenge that has not yet been met. EeroQ, with an engineering team led by professors from Princeton University and Michigan State University, offers a unique chip design with the potential to leapfrog to the front of the race. Today's QC players include IBM, Google, Honeywell, Amazon, and many other large enterprises. Alongside such large players, EeroQ is a founding member of the United States Quantum Industry Coalition.
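The "beyond 0s and 1s" point can be made concrete in a few lines: a single qubit is described by two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1, and a gate like the Hadamard puts a definite 0 into an equal superposition. This is a textbook sketch of qubit state, not anything specific to EeroQ's chip design.

```python
import math

# A qubit state is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1;
# |a|^2 is the probability of measuring 0, |b|^2 of measuring 1.
def probabilities(state):
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

zero = (1 + 0j, 0 + 0j)  # the classical bit 0, written as a qubit state

def hadamard(state):
    """Apply a Hadamard gate, which maps |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

plus = hadamard(zero)
p0, p1 = probabilities(plus)  # each is 0.5: equal chance of reading 0 or 1
```

A classical bit can only be one of the two basis states; the superposition state `plus` is what lets n qubits explore 2^n amplitudes at once, which is the source of QC's potential, and of the engineering difficulty of keeping those amplitudes stable.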
"This is great news for a couple of reasons. First, EeroQ's move into The Terminal in Humboldt Park is aligned with our INVEST South/West strategy, which allows us to invest in under-resourced neighborhoods across the South and West Sides by creating new jobs, businesses and access to opportunity for all residents," said Chicago Mayor Lori E. Lightfoot. "Second, Chicago is primed to become a major hub for quantum technology thanks to a number of notable investments which will have long-term benefits for our entire region. Thus, in choosing to operate on Chicago's West Side, EeroQ will not only be well-positioned to capitalize on the many opportunities that quantum technology will drive, but also help to revitalize communities in need."
"Chicago will become a leading center for quantum computing," said Nick Farina, EeroQ CEO. "Talent and local support are critical, and we have found all of that in Chicago with a world-class base of quantum computing talent and unmatched local support. We conducted a national search for a headquarters and Chicago was by far the best option for us."
EeroQ is thrilled to receive support from key stakeholders in Chicago's quantum computing revolution:
"IBT Group, Mansueto Office and the entire Terminal team are thrilled to kick off years of work with such an exciting visionary as Nick Farina and EeroQ. We appreciate their confidence in the project and our team and look forward to the evolution of The Terminal as Chicago's premier creative work environment," said Gary Pachucki of IBT Group.
"Having EeroQ's headquarters in Chicago will pave the way for greater collaboration with our local community of scientists, engineers, and industry partners," said David Awschalom, Director of the Chicago Quantum Exchange. "EeroQ will be an asset to the region, and we look forward to welcoming them as part of Chicago's growing hub of quantum innovation and talent."
"We are thrilled with EeroQ's arrival," said Brad Henderson, CEO, P33. "As a trailblazer in the field, their presence underscores the existing strength of Chicago's quantum ecosystem and further establishes it as a center for the advancement and commercialization of quantum technologies. We look forward to working with them on the continued development of Chicago's quantum economy."
"On behalf of the business community, welcome EeroQ to Chicago," said Michael Fassnacht, President & CEO, World Business Chicago. "Here you will find an engaged, strong, and active community of innovators, disruptors, and business leaders excited to have you join us in seeing Chicago continue to expand into a global hub in quantum technology."
"Illinois has been at the forefront of computing for more than 70 years, and EeroQ is joining the next generation of Illinois computer science pioneers," said Intersect Illinois CEO Dan Seals. "We welcome EeroQ and look forward to watching the company grow as a part of Illinois' thriving quantum computing ecosystem."
"This deal was incredibly complex based on EeroQ's unique requirements, and we are thrilled we could provide the optimal solution at The Terminal," said Emily Smith, SVP at Bespoke Commercial Real Estate. "This has been a significant step forward for Chicago's continued emergence on the national scale in the fields of quantum & biotech. We very much look forward to their growth and success here locally, and couldn't think of a better place for them to plant their flag."
View source version on businesswire.com: https://www.businesswire.com/news/home/20220106005374/en/
Contacts
Nicole Haveman, media@eeroq.com, +1-989-494-1202
A Year After the Capitol Insurrection, Faith in Democracy Wanes – News@Northeastern
A year after a mob stormed the U.S. Capitol in an attempt to disrupt the formal certification of President Joe Biden's electoral victory, Americans' faith in the country's electoral process, and in its democracy writ large, has only fallen.
Large swaths of the electorate still think that the 2020 election was rigged, and over the last year, a modest number of people have indeed lost confidence in the fairness of that election, according to a new U.S. study.
"Every single one of our predictions from January 2021 came true," says David Lazer, university distinguished professor of political science and computer sciences at Northeastern, and one of the principal investigators on the study. "What we're seeing is that while there isn't a ton of explicit support for storming the Capitol, there's a notable ambivalence among Republicans, driven in part by a very strong belief that the election was stolen."
David Lazer, distinguished professor of political science and computer and information science. Photo by Adam Glanzman/Northeastern University
Lazer is part of a team of researchers from Northeastern, Harvard, Northwestern, and Rutgers universities that makes up the Covid States Project. The team conducted a survey of public sentiment immediately after the Jan. 6 insurrection in 2021 and issued a follow-up study a year later.
The researchers surveyed 15,269 people in the U.S. across all 50 states plus Washington, D.C., between Dec. 22, 2021, and Jan. 5, 2022. What they found reveals stark differences between Democrats and Republicans, as well as a drop in Americans' faith in their political institutions overall.
Over the last year, Republicans and independents became more ambivalent about the Capitol riot: opposition to the event dropped by 11 percentage points among members of the GOP and by 8 percentage points among independents, but held steady among Democrats.
This warming attitude, Lazer says, may be explained by a parallel belief among voters that the 2020 presidential election was stolen from Republican candidate Donald Trump, who was the incumbent at the time. The latest data show that a modest number of people across all political affiliations (3 percent to 4 percent of those surveyed) expressed less confidence in the fairness of the election this time around than they did last year.
But the belief is especially strong among Republicans and independents. Sixty-two percent of Republicans believe that if the votes had been fairly counted, Donald Trump would have won the 2020 election, while only 18 percent disagree with that sentiment. Independent voters lag only slightly behind: 54 percent believe Trump would have won the election if it were a fair contest, while 21 percent disagree.
Compare those figures to Democrats': only 5 percent of Democrats surveyed believe Trump would have won, and 88 percent don't.
"If you sincerely believe that the election was stolen, it becomes the predicate for lots of other things," Lazer says. "At minimum, it becomes the predicate for taking equal and opposing action; at the extreme end of the continuum, it could become the predicate for the kind of violence we saw last year."
Looking back, Lazer says it's clear that the particular circumstances of the 2020 election created a perfect storm that crystallized distrust of the democratic process among people who already may have had misgivings.
COVID-19 precautions meant a huge number of people voted by mail instead of in person, and those who mailed in their ballots bucked historical trends. In a given election, more Republicans typically vote by mail than Democrats; in 2020, it was Democrats who did so. As a result, Trump appeared to be in the lead on election night, before all the mail-in ballots had been counted.
Then there was Trump himself, whose "stop the steal" campaign culminated in a speech on Jan. 6, 2021, during which he posited again and again that Biden's victory was the result of a rigged election stolen by emboldened radical-left Democrats.
"There was a sense in which you couldn't have written a political thriller like this because it just wouldn't have been plausible," Lazer says. "There was already distrust in elections for some people, and this was like throwing gasoline on a fire. Everything just seemed to fit that narrative so neatly."
For media inquiries, please contact Shannon Nargi at s.nargi@northeastern.edu or 617-373-5718.
Do the Legal Rules Governing the Confidentiality of Cyber Incident Response Undermine Cybersecurity? – Lawfare
When businesses suspect that they may have experienced a cyber incident, their first call is typically not to a cybersecurity firm, public relations outfit or even their cyber insurer. Instead, it is increasingly to a lawyer. These lawyers, many of whom market themselves as "breach coaches," then coordinate all subsequent elements of the response to their client's potential cyber incident, including the efforts of the client's internal personnel and those of third-party cybersecurity and public relations firms that the lawyer hires directly. More than 4,000 cyber incidents in 2018 were handled in this manner. Similarly, the cybersecurity firm CrowdStrike reports that 50 percent of its investigations were directed by an attorney in 2020. This approach is accepted so widely that in-house attorneys explicitly recommend it in their professional publications, and many cyber insurers provide policyholders with 800 numbers to call in the event of a cyber incident that go directly to an independent law firm rather than the insurer.
Lawyers' pole position in coordinating cyber incident response is driven predominantly by their capacity to shield any information that is produced during that process from discovery in a subsequent lawsuit. Under long-standing case law, communications between consultants and the attorneys who hire them to help provide legal advice to a client are shielded by the attorney-client privilege. Additionally, any documents and mental processes of third-party consultants that are produced in reasonable anticipation of litigation (whether or not they are communicated to the attorney) are similarly shielded from discovery under the work product immunity doctrine.
Putting lawyers, rather than technical security firms, in charge of data breach investigations can influence the incident response process in many ways, and it's not entirely clear to what extent law firms' emphasis on protecting attorney-client privilege and work product immunity alters the course of those investigations. We are an interdisciplinary group of researchers (in law, political science and computer science) who are investigating this question. We are particularly interested in the prospect that these confidentiality doctrines have the potential to significantly undermine the efficiency and effectiveness of cybersecurity controls and processes. To look at this question, we are interviewing and surveying a broad range of participants in the cybersecurity ecosystem, including breach coach lawyers, cyber-insurance personnel and digital forensic investigators.
Some of the potential distorting effects of attorney-client privilege and work product doctrine are well known, if only because they have played out so visibly in high-profile data breaches. For instance, several salient cases suggest that firms wishing to preserve the confidentiality of their post-breach efforts should consider launching dual investigations, with one focused on understanding the root causes of an incident and potential security solutions, and the other intended solely to facilitate the efforts of the company's lawyers. Doing so can limit the risk that post-breach assessments of legal and regulatory risks may be discoverable because they are combined with nonlegal materials, such as recommendations for improving future cybersecurity protocols. This was the strategy that Target employed when hackers stole 41 million payment card numbers from the retailer in 2013. In holding that the results of the second investigation were shielded from discovery in a subsequent class-action lawsuit, the court emphasized that this investigation was conducted solely for legal purposes. Not only does this approach have the obvious potential to inflate the costs of cyber incident response, but it may well undermine the effectiveness of such responses by creating confusion about the distinct responsibilities of the two investigative teams.
By contrast, when firms victimized by cyberattacks have tasked cybersecurity firms with both supporting their lawyers and helping them to shore up their technical defenses, courts have been much less willing to treat any resulting communications as privileged. This was the result when health insurer Premera hired security firm Mandiant to conduct a security audit, which detected a year-long breach that affected 11 million customers' personal information. After the breach was discovered, Premera amended Mandiant's statement of work and instructed it to report directly to its external counsel. In holding that Mandiant's ultimate report was not protected by privilege, the court emphasized that Mandiant had been engaged prior to the discovery of the breach and that its report was not solely intended to provide legal advice. Documents, the court reasoned, prepared for a purpose other than or in addition to obtaining legal advice and intended to be seen by persons other than the attorney are not privileged.
Unlike Premera, Target was willing to go to extreme (and expensive) lengths to protect attorney-client privilege in the aftermath of its 2013 breach, perhaps because it knew that the incident was likely to lead to litigation. But for many breached firms, paying for a dual-track investigation is costly and inefficient. Nor is it entirely clear that it's a necessary step for preserving attorney-client privilege. A 2021 ruling held that a forensics report for a 2018 data breach of the Marriott hotel chain was privileged, even though the report was prepared by IBM, which had also provided pre-breach security services to Marriott. Though IBM had been working with Marriott since 2011, following the investigation, the company entered into a new statement of work with Marriott and BakerHostetler, the law firm the hotel chain retained to manage the breach investigation.
Our preliminary investigations suggest that attorney-client privilege and work product doctrine create potential distortions that may go much deeper than triggering occasional inefficient dual-track cyber-incident investigations. For instance, in the course of our initial conversations with participants in the cybersecurity ecosystem, we have learned that lawyers coordinating cyber-incident investigations routinely refuse to make forensic reports produced by cybersecurity firms available to cyber insurers. Such disclosure, these attorneys worry, could constitute a waiver of attorney-client privilege. Irrespective of the accuracy of this concern (which has not yet been tested in court), this practice may deprive insurers of potentially useful information that they could use to improve their underwriting processes or to advise other policyholders. Some attorneys, moreover, go even further, instructing their clients and cybersecurity firms not to disclose forensic reports to the client's internal information technology (IT) personnel, lest a court interpret that report to have been produced for business, rather than legal, purposes.
Some of our preliminary discussions suggest even more fundamental ways in which lawyers' efforts to preserve confidentiality may undermine cybersecurity. For instance, some industry participants tell us that attorneys increasingly instruct forensic investigation teams not to record their findings in a written report at all, because of the potential that such a report could make its way into the hands of plaintiffs' lawyers. Instead, forensic experts are instructed to explain the results of their investigations either via stripped-down PowerPoint presentations or through entirely oral presentations. This, of course, raises the prospect that any information communicated to clients that may allow them to improve their cybersecurity efforts in the future will not be fully understood by them or accurately communicated to others within the firm.
Similarly, the rules governing confidentiality appear to create a perverse incentive for firms to hire a different security firm to run the post-breach investigation than the one that already provided pre-breach monitoring services. This slows the response, as the new firm must be engaged, contracted, and provided with network access, all while an adversary has already infiltrated the target's networks. Further, the new firm may be unfamiliar with the network environment, often needing to navigate new software and IT portals to access monitoring tools and the corresponding logs.
Perhaps most perniciously of all, current rules may even disincentivize firms from taking proactive steps to conduct cybersecurity audits or other forms of monitoring. Since privilege and work product immunity attach only to documents produced when a firm reasonably anticipates litigation or communicates with attorneys to secure legal advice, these protections may not apply to materials produced to help detect a future breach. So companies may be less inclined to engage in those efforts directly or to hire cybersecurity firms to do so on their behalf. And even when they do, they may be reluctant to use the same firms for post-breach investigations that they hired for pre-breach monitoring, even if the firms coordinating pre-breach monitoring are more familiar with their computer systems and could conduct a faster forensic investigation.
Beyond distorting what information is documented and shared, current confidentiality rules create operational and business complexities. Because they place lawyers at the center of incident response, they lead law firms to charge large hourly fees, control communications, and even choose which forensics firms are hired. This disrupts established relationships and work patterns between internal IT teams and external cybersecurity vendors. In some cases this disruption may produce a variety of benefits that have nothing to do with confidentiality. For instance, some lawyers claim they are particularly adept at efficiently managing multiple work streams spanning technical investigation, ransomware negotiation, regulatory notifications, public relations and insurance. Others dispute these alleged benefits; some security professionals claim that centralizing communications through lawyers creates bottlenecks and delays, and even accuse lawyers of unmeritocratic hiring.
We are still working to understand the prevalence of these different practices for preserving attorney-client privilege, and their impact on the investigation process and findings. But policymakers, insurers and security researchers are all struggling to assemble reliable datasets about cyber threats and the effectiveness of different countermeasures. The Cyberspace Solarium Commission report issued in 2020 even recommended that Congress establish a new Bureau of Cyber Statistics specifically to collect statistical data on cybersecurity. So it's worth considering how concerns about attorney-client privilege and work product doctrine may be contributing to those challenges by influencing the processes for investigating breaches, sharing and aggregating information about those breaches, and learning from past cybersecurity incidents.
It's not clear how big a problem confidentiality considerations are for cybersecurity investigations and data collection, so it's hard to know what the right solution is, or, indeed, whether any solution is even needed. Jeff Kosseff has proposed the creation of a stand-alone privilege for cybersecurity work so that firms will be less reluctant to hire security professionals to assess and audit their computer systems. But it's also possible that creating new privileges around cybersecurity could make it harder for people to sue firms in the aftermath of breaches, thereby limiting those firms' accountability. On the other hand, it remains an open question how effective such lawsuits have been at incentivizing better cybersecurity practices.
The influence of attorney-client privilege and work product immunity on cybersecurity raises many more similarly open questions. It seems possible that the doctrines governing attorney-client privilege and work product have had the unintended consequences of undermining cybersecurity, information sharing about data breaches, and insurers' ability to collect empirical data about cybersecurity incidents and the most effective countermeasures to prevent and mitigate those incidents. Given how central lawyers have become to breach response, and how high a priority maintaining confidentiality is for many of them, these questions are worthy of more study and attention as technical experts, policymakers and insurers all grapple with the best ways to learn from cybersecurity incidents. We would welcome any readers with experience on these issues to contact us directly so that we can learn more about how the laws governing attorney-client privilege and work product can promote, or undermine, effective cybersecurity.
Mr. Jeroen Tas Joins Zylorion as a Strategic Advisor and Observer to the Board of Directors – PRNewswire
CALGARY, AB, Jan. 6, 2022 /PRNewswire/ - PsiloTec Health Solutions Inc., operating as Zylorion, ("Zylorion" or the "Company"), a mental health care and psychedelic therapy focused innovator, is pleased to announce Mr. Jeroen Tas has joined the Company as a strategic advisor and will also act as an observer to the Board of Directors.
Mr. Tas is an innovation leader and entrepreneur with deep expertise in large-scale digital transformation and a history of leveraging information technology to transform and grow businesses. Mr. Tas is the former Chief Innovation and Strategy Officer of Koninklijke Philips N.V. ("Philips") (NYSE:PHG; AMS:PHIA), a global health technology company, where he was instrumental in the transition of Philips to a customer-centric, digital health tech solutions company. Before joining Philips, Mr. Tas co-founded and served as President, COO and Vice-Chairman of the Board for Mphasis, a technology solutions company focused on services for the financial industry, which was ultimately acquired by HP (EDS). Mr. Tas was also the former head of Transaction Technology Inc., Citibank's tech lab, responsible for the innovation and development of Citibank's customer-facing systems, where he oversaw the first launch of internet banking, payment networks and internet-based self-service devices.
Mr. Tas is the 2004 winner of the E&Y Entrepreneur of the Year award in the Information Technology category for the New York region. Mr. Tas was also the recipient of the 2013 Dutch Chief Information Officer of the year award, the NASSCOM 2014 Global Chief Information Officer award, the World Innovation Congress 2014 Chief Information Officer Leadership award, the CIONet 2014 European Chief Information Officer award, the IT Executive 2014 award and the Accenture 2015 Innovator of the Year award. Mr. Tas is a native of the Netherlands and holds a Master's in Computer Science and Business Administration from the Vrije University, Amsterdam.
"I am delighted to be a part of the Zylorion organization and to be working with the world-renowned team of experts that Dr. Silverstone and the Board of Directors have been able to attract to the Company. In this nascent industry, we as a team look forward to making an impact by developing new and innovative treatment solutions," commented Mr. Jeroen Tas.
In his role as a Strategic Advisor and Board Observer, Mr. Tas will provide strategic guidance and advise on the development and delivery of the Company's clinical and technology-enabled therapy programs. Mr. Tas will also be acting as a non-voting observer to the Board of Directors. On August 13, 2021, the Company announced that it had entered into a non-binding letter of intent with Michichi Capital Corp. ("Michichi") in respect of a transaction which would, if completed, result in a reverse takeover of Michichi by Zylorion (the "RTO"). If the RTO closes, it is the current intention of the Company's Board of Directors to nominate Mr. Tas for election to the Michichi Board of Directors at Michichi's first annual general meeting following closing of the RTO. While Michichi and the Company continue to advance the RTO, the parties have not yet entered into a binding agreement with respect to the same.
"We are thrilled to have Mr. Tas join our organization and our mission. He brings a wealth of experience both as a successful entrepreneur and as a global technology leader. Mr. Tas has a proven track record of leading transformational change and creating sustainable shareholder value," noted Dr. Peter Silverstone, Chief Executive Officer & Director.
About Zylorion
Zylorion is a biopharmaceutical company engaged in the development and delivery of integrated mental health therapies to address psychological and neurological mental health conditions. Zylorion is focused on the research, development and commercialization of psychedelic-based compounds coupled with therapeutic treatment programs targeting a continuum of mental health conditions, such as major depressive disorder (MDD), treatment-resistant depression (TRD), post-traumatic stress disorder (PTSD), general depression, anxiety disorders, and a number of addictive tendencies. Zylorion aims to leverage leading technologies to support the scalability and accessibility of its integrated therapy programs in its mission to enable those experiencing mental health challenges to thrive.
Cautionary Note Regarding Forward-Looking Statements
This news release contains statements that constitute forward-looking information ("forward-looking information") within the meaning of the applicable Canadian securities legislation. All statements, other than statements of historical fact, are forward-looking information and are based on expectations, estimates and projections as at the date of this news release. Any statement that discusses predictions, expectations, beliefs, plans, projections, objectives, assumptions, future events or performance (often but not always using phrases such as "expects", or "does not expect", "is expected", "anticipates" or "does not anticipate", "plans", "budget", "scheduled", "forecasts", "estimates", "believes" or "intends" or variations of such words and phrases or stating that certain actions, events or results "may" or "could", "would", "might" or "will" be taken to occur or be achieved) are not statements of historical fact and may be forward-looking information.
SOURCE Zylorion
Kerstin Perez is searching the cosmos for signs of dark matter – MIT News
Kerstin Perez is searching for imprints of dark matter. The invisible substance embodies 84 percent of the matter in the universe and is thought to be a powerful cosmic glue, keeping whole galaxies from spinning apart. And yet, the particles themselves leave barely a trace on ordinary matter, thwarting all efforts at detection thus far.
Perez, a particle physicist at MIT, is hoping that a high-altitude balloon experiment, to be launched into the Antarctic stratosphere in late 2022, will catch indirect signs of dark matter, in the particles that it leaves behind. Such a find would significantly illuminate dark matter's elusive nature.
The experiment, which Perez co-leads, is the General AntiParticle Spectrometer, or GAPS, a NASA-funded mission that aims to detect products of dark matter annihilation. When two dark matter particles collide, it's thought that the energy of this interaction can be converted into other particles, including antideuterons, particles that then ride through the galaxy as cosmic rays and can penetrate Earth's stratosphere. If antideuterons exist, they should come from all parts of the sky, and Perez and her colleagues are hoping GAPS will be at just the right altitude and sensitivity to detect them.
"If we can convince ourselves that's really what we're seeing, that could help point us in the direction of what dark matter is," says Perez, who was awarded tenure this year in MIT's Department of Physics.
In addition to GAPS, Perez's work centers on developing methods to look for dark matter and other exotic particles in supernovae and other astrophysical phenomena captured by ground and space telescopes.
"We measure so much about the universe, but we also know we're completely missing huge chunks of what the universe is made of," she says. "There need to be more building blocks than the ones we know about. And I've chosen different experimental methods to go after them."
Building up
Born and raised in West Philadelphia, Perez was a self-described "indoor kid," mostly into arts and crafts, drawing and design, and building.
"I had two glue guns, and I remember I got into building dollhouses, not because I cared about dolls so much, but because it was a thing you could buy and build," she recalls.
Her plans to pursue fine arts took a turn in her junior year, when she sat in on her first physics class. Material that was challenging for her classmates came more naturally to Perez, and she signed up the next year for both physics and calculus, taught by the same teacher with infectious wonder.
"One day he did a derivation that took up two-thirds of the board, and he stood back and said, 'Isn't that so beautiful? I can't erase it.' And he drew a frame around it and worked for the rest of the class in that tiny third of the board," Perez recalls. "It was that kind of enthusiasm that came across to me."
So buoyed, she set off after high school for Columbia University, where she pursued a major in physics. Wanting experience in research, she volunteered in a nanotechnology lab, imaging carbon nanotubes.
"That was my turning point," Perez recalls. "All my background in building, creating, and wanting to design things came together in this physics context. From then on, I was sold on experimental physics research."
She also happened to take a modern physics course taught by MIT's Janet Conrad, who was then a professor at Columbia. The class introduced students to particle physics and the experiments underway to detect dark matter and other exotic particles. The detector generating the most buzz was CERN's Large Hadron Collider in Geneva. The LHC was to be the largest particle accelerator in the world, and was expected imminently to come online.
After graduating from Columbia, Perez flew west to Caltech, where she had the opportunity to go to CERN as part of her graduate work. That experience was invaluable, as she helped to calibrate one of the LHC's pixel detectors, which is designed to measure ordinary, well-known particles.
"That experience taught me, when you first turn on your instrument, you have to make sure you can measure the things you know are there, really well, before you can claim you're looking at anything new," Perez says.
Front of the class
After finishing up her work at CERN, she began to turn over a new idea. While the LHC was designed to artificially smash particles together to look for dark matter, smaller projects were going after the same particles in space, their natural environment.
"All the evidence we have of dark matter comes from astrophysical observations, so it makes sense to look out there for clues," Perez says. "I wanted the opportunity to, from scratch, fundamentally design and build an experiment that could tell us something about dark matter."
With this idea, she returned to Columbia, where she joined the core team that was working to get the balloon experiment GAPS off the ground. As a postdoc, she developed a cost-effective method to fabricate the experiment's more than 1,000 silicon detectors, and has since continued to lead the experiment's silicon detector program. Then in 2015, she accepted a faculty position at Haverford College, close to her hometown.
"I was there for one-and-a-half years, and absolutely loved it," Perez says.
While at Haverford, she dove into not only her physics research, but also teaching. The college offered a program to help faculty improve their lectures, with each professor meeting weekly with an undergraduate trained to observe and give feedback on their teaching style. Perez was paired with a female student of color, who one day shared with her an unwelcoming experience in an introductory course that ultimately discouraged the student from declaring a computer science major.
Listening to the student, Perez, who has often been the only woman of color in advanced physics classes, labs, experimental teams, and faculty rosters, recognized a kinship, and a calling. From that point on, in addition to her physics work, she began to explore a new direction of research: belonging.
She reached out to social psychologists to understand issues of diversity and inclusion, and the systemic factors contributing to underrepresentation in physics, computer science, and other STEM disciplines. She also collaborated with educational researchers to develop classroom practices to encourage belonging among students, with the motivation of retaining underrepresented students.
In 2016, she accepted an offer to join the MIT physics faculty, and brought with her the work on inclusive teaching that she began at Haverford. At MIT, she has balanced her research in particle physics with teaching and with building a more inclusive classroom.
"It's easy for instructors to think, 'I have to completely revamp my syllabus and flip my classroom, but I have so much research, and teaching is a small part of my job that frankly is not rewarded a lot of the time,'" Perez says. "But if you look at the research, it doesn't take a lot. It's the small things we do, as teachers who are at the front of the classroom, that have a big impact."
The age of hyper-accelerated AI adoption – CTech
AI is one of the most powerful technological leaps humankind has ever made, driving a wide range of new technologies, services, and products we've never thought of. We live in the midst of an AI revolution that will keep transforming the world.
Artificial intelligence impacts every industry, every business, and the life of every human being on the planet. AI computers harness massive amounts of data and use their constantly evolving intelligence to make optimal decisions and discoveries in a fraction of the time it would take humans. There's no major industry modern AI hasn't already affected or won't affect soon. That's especially true of the past few years: thanks to ever-increasing computing and processing power, startups and enterprises have been able to ramp up their big-data analytics activities, create machine learning and AI-based services and products, and more. As individuals, we already see artificial intelligence in our smart devices, cars, healthcare systems and apps, and we'll continue to see its influence permeate deeper into many other industries for the foreseeable future.
The AI revolution is leading to the creation of human-machine teams working together: the machine processes amounts of data no human could, delivering insights that help humans make decisions no machine can.
2021 has brought significant advances in AI and in the next year this momentum will drive humanity even further. Here is where I believe AI efforts in 2022 will be focused as they parse big data and look for new revenue opportunities:
Era of simulation and virtual worlds: Today's AI-based computing platforms enable us to accurately simulate the real world and to collaborate in that simulated virtual world (the much-discussed Metaverse). These physically accurate models generate large amounts of synthetic data for training advanced AI models to much higher precision than is possible today. We'll also see advancing 3D standards for describing virtual worlds. Building accurate and rich digital twins (counterparts to everything in the real world: airplanes, cars, factories, bridges, cities and even Earth itself) is one of the grand challenges in computer science. Many industries are starting to examine and adopt digital twins and virtual worlds, exploring the potential for operational efficiencies and cost savings. Everything we build in the real world will have a counterpart in the virtual world, enabling us to experience, test and optimize complex designs well before we commit to building them in the real world, and to tackle humanity's greatest challenges. That will result not just in faster product delivery and billions of dollars saved, but will also allow us to deal with climate change and keep our planet safe.
AI will talk and understand us more than ever: The synergy of hardware and AI models has made big leaps forward in natural language processing (NLP). Speech synthesis, for example, is poised to become just as emotive and persuasive as the human voice in 2022. This will help industries like retail, banking and healthcare improve services for their customers. Today we already see conversational bots serving as first-line help agents answering calls and chats, and new developments in NLP will take them to the next level. Companies will race to deploy new conversational AI tools that let us work more efficiently and effectively using natural language. Companies using both speech and text to interact with other businesses and customers will employ AI to understand the context or sentiment of what a person is saying. Is the customer frustrated? Is your boss being sarcastic? More and more, computers will be able to identify these nuances, just as if they were human.
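Production conversational AI judges tone with large neural models, but the basic idea of sentiment scoring can be sketched with a deliberately simple lexicon approach. The word lists and scoring rule below are illustrative inventions, not any real system:

```python
# Toy lexicon-based sentiment scorer. Real conversational AI uses large
# neural models; this only illustrates the basic scoring idea.
POSITIVE = {"great", "thanks", "helpful", "love", "perfect"}
NEGATIVE = {"frustrated", "broken", "terrible", "waiting", "useless"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]; negative values suggest frustration."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("I love this, very helpful, thanks!"))  # 1.0
print(sentiment_score("Still waiting, this is useless."))     # -1.0
```

A frustrated customer's message lands below zero, which is the sort of signal a help desk could use to escalate a conversation; neural models do the same thing with far richer context, catching nuances like sarcasm that a word list cannot.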
Programmable cars: AI transforms the driving experience. In the next year we'll see more automakers creating software-defined architectures with headroom to support new applications and services via automatic over-the-air updates, just like our smartphones, meaning that vehicles will get better and safer over time. Automakers will also begin to use simulation to train and validate deep neural network models, like testing self-driving cars for a broad range of driving conditions which are too dangerous for a driver. Soon, we will also see cars that will start collaborating on the road to avoid traffic jams and prevent accidents.
Automate ordering and streamline customer service: AI will help to optimize complex processes like supply chain management, which has become a critical area for retailers wanting to meet customer demand for product availability and faster delivery. AI will enable more accurate just-in-time forecasting, ensuring the right material is ready to manufacture the right product, which needs to be at the right store at the right time to be ordered and consumed through automated order-taking systems such as kiosks. Computer vision and robotics will add AI intelligence to distribution centers, leading the way to autonomous forklifts, robots, and intelligent multi-shuttle cabinets that reduce conveyor starvation and downtime and automate pick-and-pack of items to double throughput.
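The forecasting behind automated ordering can be glimpsed in the classic reorder-point calculation; the function and the example numbers below are a simplified illustration (real retail systems replace the fixed demand rate with ML-driven forecasts):

```python
# Classic reorder-point formula: order when stock falls to the amount
# expected to be consumed during resupply, plus a safety buffer.
def reorder_point(daily_demand: float, lead_time_days: float,
                  safety_stock: float = 0.0) -> float:
    """Stock level at which a new order should be triggered."""
    return daily_demand * lead_time_days + safety_stock

# Hypothetical example: a store sells 40 units/day, resupply takes
# 3 days, and it keeps 20 units of safety stock.
print(reorder_point(40, 3, 20))  # 140.0
```

When the shelf count drops to 140 units, an automated system places the next order, so the store neither runs out during the 3-day resupply window nor ties up capital in excess inventory.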
In supermarkets and big-box stores, retailers will increase their use of intelligent video analytics and computer vision to create automated checkouts and autonomous shopping. In 2022, we'll see many retailers offering hyper-personalized shopping experiences. In retail stores, AI will process massive amounts of data in real time to give retailers a 360-degree view of their customers and provide more personalized offers and recommendations that drive a better customer experience.
Michael Kagan is the Chief Technology Officer of NVIDIA
2 Cryptocurrencies That Could Outpace Bitcoin – The Motley Fool
Bitcoin's (CRYPTO:BTC) growth during the past decade has started to attract more interest from institutions and big asset managers as a long-term investment. Despite the growth of other cryptocurrencies, such as Ethereum and Dogecoin, Bitcoin is still the largest by value, commanding 40% of the market.
But other cryptocurrencies have much greater utility than Bitcoin, especially in the realm of decentralized finance (DeFi), non-fungible tokens (NFTs), and much more. This is causing other coins to outpace Bitcoin, and 2022 could see that trend continue.
For those reasons and others, Avalanche (CRYPTO:AVAX) and Solana (CRYPTO:SOL) are two fast-growing cryptocurrencies that are worth buying.
Image source: Getty Images.
Avalanche was developed by Ava Labs, a crack team of experts in computer science, economics, finance, and law. It launched in 2020 with a market cap of about $100 million. The price of AVAX soared more than 3,000% during the past year, and it now ranks as the 11th most valuable cryptocurrency, with a market cap of $25 billion. Based on recent developments, AVAX, which is the native cryptocurrency of Avalanche, could see more gains next year.
Recently, Deloitte announced a partnership with Ava Labs to use the Avalanche blockchain to support a new disaster-recovery platform for state and local governments. The platform is called Close As You Go and helps local governments "simplify and streamline disaster reimbursement applications to the Federal Emergency Management Agency (FEMA)." The Avalanche blockchain would help improve the accuracy of federal claims and reduce instances of fraud.
Avalanche ranks as the third most active cryptocurrency on GitHub, according to CryptoMiso, which indicates strong interest from project developers. Avalanche is getting more attention from developers and investors primarily because it is much faster, less expensive, and more energy-efficient than Bitcoin.
More deals like Deloitte's could fuel positive sentiment and push AVAX's value higher, so consider adding a small position to your crypto wallet.
The price of SOL has risen even faster than AVAX, up 12,000% over the past year, giving it a market value of $54 billion, fifth most among cryptos. Cryptocurrencies are risky, and it's possible to lose all your money, but if you invest only what you're willing to lose, Solana's performance shows it's worth the risk of a small investment.
Solana was designed to be fast. It was developed using an innovative blockchain architecture known as proof of history, allowing Solana to potentially process as many as 710,000 transactions per second using today's hardware. That is many times faster than the biggest credit card networks, such as Visa and Mastercard. Solana currently processes about 2,000 transactions per second, whereas Bitcoin typically handles around three.
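To put those throughput figures in perspective, here is a quick back-of-envelope comparison using only the numbers quoted in this article (the Solana peak is a theoretical capacity, not an observed real-world rate):

```python
# Transaction-throughput figures quoted above, in transactions per second.
tps = {
    "Bitcoin (typical)": 3,
    "Solana (current)": 2_000,
    "Solana (theoretical peak)": 710_000,
}

# Express each rate as a multiple of Bitcoin's typical throughput.
bitcoin = tps["Bitcoin (typical)"]
for name, rate in tps.items():
    print(f"{name}: {rate:,} tx/s (~{rate // bitcoin:,}x Bitcoin)")
```

Even Solana's current rate works out to hundreds of times Bitcoin's typical throughput, which is the core of the speed argument the article makes.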
Given its speed, energy efficiency, and relatively lower transaction cost, Solana is quickly gaining interest from developers working on DeFi and NFT applications.
Solana is a good cryptocurrency to consider if you're interested in investing in the NFT boom. In case you're wondering, NFTs are basically virtual collectibles sold as digital tokens that can be traded like any cryptocurrency. Solanart and Solsea are two NFT marketplaces based on Solana, and there will likely be more.
The growing popularity of NFTs is one factor contributing to investor interest in Solana. It's a hot market that doesn't show signs of slowing down. Total NFT sales reached an estimated $26.9 billion in 2021. Several top brands, including Nike and toy company Mattel, have expressed interest in offering virtual collectibles for sale to NFT collectors. This doesn't necessarily benefit Solana directly, but it leads to positive sentiment around SOL, which could support its value.
Both Solana and Avalanche check all the right boxes of what to look for when investing in altcoins or any cryptocurrency besides Bitcoin. They have growing utility value, something that Bitcoin lacks -- beyond being the most widely held digital currency.
Keep in mind that no amount of analysis can guarantee which cryptocurrencies will go up in value. The right cryptocurrency could make you a millionaire, but be aware of the risks. The best way to proceed is to stay alert to recent developments that could benefit or hurt the adoption of any given cryptocurrency, invest what you can afford to lose, and focus on the long term.
This article represents the opinion of the writer, who may disagree with the official recommendation position of a Motley Fool premium advisory service. We're motley! Questioning an investing thesis -- even one of our own -- helps us all think critically about investing and make decisions that help us become smarter, happier, and richer.