
Microsoft Hires Former Top Amazon Cloud Executive, Adding to Rivalry – The Wall Street Journal

Microsoft Corp. hired a top Amazon.com Inc. cloud veteran and ally of Chief Executive Andy Jassy, according to a company document, in a rare jump of a senior executive between the cloud-computing industry's leading rivals.

Charlie Bell, who left Amazon Web Services recently, is now listed by Microsoft as a corporate vice president, according to an internal employee register viewed by The Wall Street Journal. Mr. Bell was a senior vice president at Amazon Web Services, as the company's cloud-computing arm is known, and for years worked with Mr. Jassy, then the AWS boss, to build up the cloud-computing business.

Mr. Bell spent more than two decades at AWS and was widely considered an heir apparent to Mr. Jassy. When Mr. Jassy was named to replace Jeff Bezos as Amazon's CEO, the company went outside for his replacement. In March, Amazon tapped Adam Selipsky to become the cloud business's next chief executive.

Mr. Selipsky also had a history at AWS and close ties to Mr. Jassy. He was one of the first vice presidents hired in 2005 to work at the cloud-computing business, remaining there for about 11 years and rising to run marketing, sales, support, business development, partner alliances and international expansion. He left in 2016 to become CEO of the data-analytics platform Tableau Software Inc., which was acquired by Salesforce.com Inc. for more than $15 billion, before his return to Amazon.

Mr. Bell's exact role at Microsoft couldn't immediately be determined. His move to Microsoft was previously reported by Business Insider.


How Can You Unlock the Cost Benefits Of Cloud Computing? – Entrepreneur

With the pay-as-you-go model and scalability of the cloud, one is typically billed on the usage of resources, which can scale up or down according to demand


August 24, 2021 | 4 min read

Opinions expressed by Entrepreneur contributors are their own.

You're reading Entrepreneur India, an international franchise of Entrepreneur Media.

As more and more organizations embrace cloud computing, one of the major driving factors behind this adoption is the proposed cost savings. The pandemic has furthered this adoption and accelerated digital transformation at enterprises. A study by NASSCOM on India's public cloud market reported a 15-20 per cent reduction in operational costs for SMEs in India, along with a 20-25 per cent boost in productivity.

Let's take a look at how cloud computing can help you reduce costs for your business. While going through the list, keep in mind that the cost impact varies for every organization depending on its IT objectives, current technology infrastructure, type of applications, etc., and that a proper IT strategy is crucial to realize the cost benefits of the cloud.

Flexible payment model

With the pay-as-you-go model and scalability of the cloud, you are typically billed on the usage of resources, which you can scale up or down according to demand. This allows you to save on costs for under-utilized resources. Cloud also requires lower initial capital investment in software and licenses, as the upfront cost of the cloud is less than that of in-house solutions. You are able to eliminate the costs of running data centres, maintenance, hardware replacement and power issues. With the increased flexibility and mobility of the cloud, employee productivity is enhanced, which further drives efficiency and growth.
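As a rough sketch of how usage-based billing compares with an always-on server, consider the following; the hourly rate and usage figures are illustrative assumptions, not any provider's actual pricing:

```python
def pay_as_you_go_cost(hours_used_per_day, rate_per_hour, days=30):
    """Bill only for the hours resources actually run."""
    return hours_used_per_day * rate_per_hour * days

def fixed_capacity_cost(rate_per_hour, days=30):
    """An always-on in-house server accrues cost around the clock."""
    return 24 * rate_per_hour * days

# A workload that runs 8 hours a day pays for a third of what an
# always-on server costs at the same hourly rate.
cloud = pay_as_you_go_cost(hours_used_per_day=8, rate_per_hour=0.10)
on_prem = fixed_capacity_cost(rate_per_hour=0.10)
print(f"pay-as-you-go: ${cloud:.2f}/month, always-on: ${on_prem:.2f}/month")
```

The idle 16 hours in this sketch are exactly the under-utilized resources described above: with usage-based billing, they simply never appear on the bill.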

Optimum energy usage

When your servers are running 24/7, your energy consumption costs can become outrageously high. Coupled with inefficient server utilization in in-house systems, this can lead to extortionate energy bills. Moving to the cloud eliminates the need for an in-house data centre. With huge economies of scale and efficient power usage, cloud providers can charge you significantly less for the systems and resources utilized. It is also challenging for in-house staff to manage large-scale operations; cloud providers, with their purpose-built data centres, timely upgrades and optimum server utilization, can save you both time and money. And at no extra cost, all users stay up to date with system operations.

Accelerated business agility

Cloud presents an excellent platform for experimentation leading to faster innovation for your organization. The adoption of cloud computing promotes rapid advancement with the faster development of future products, solutions, business channels, models and resilient supply chains. Moreover, the cloud enables quicker, data-driven decision-making and business continuity, helping your organization become future-ready. All of this will propel business growth and lead to higher revenue for your enterprise. This will have a multiplier effect on your business and boost the overall cost-effectiveness.

Reduced capital expenditure

You don't need any in-house infrastructure, as the entire network and storage can be made accessible from cloud servers. You don't have to spend money purchasing licences for a limited period, as the cloud provider is responsible for allocating the resources, both hardware and software, that suit your company's needs. Furthermore, the cloud lets you expand your resources in the future without the additional cost of purchasing in-house equipment.

Enhanced productivity and efficiency

Cloud allows your employees to spend less time on deployment and more time working towards application development and new business initiatives. Your workforce can easily access applications, data and systems on the cloud from anywhere, enabling remote work. Cloud platforms also allow the processing of huge amounts of data and the application of advanced technologies such as artificial intelligence and machine learning. It is expensive and complicated to execute these technologies on physical IT infrastructure. The automation and innovation that such cutting-edge solutions built exclusively on the cloud bring to your enterprise will contribute to making your business more cost-effective.

Restructured IT teams

Eliminating in-house maintenance and infrastructure requirements will remove the need for extensive IT personnel. This decrease in workload saves IT costs on recruitment, and with a smaller IT team, you can focus more on management and redirect your team to work on other business areas. For smaller organizations without technical skills and bandwidth to assign IT workloads, cloud computing can become an effective solution for their IT requirements.

You will be able to realise these cost benefits only with proper strategy, planning and management framework around cloud adoption. Moving to the cloud without taking into consideration your business goals and current infrastructure will not convert into cost savings for your organization. Building and executing the right plan of action will enable you to enjoy cost-effectiveness, flexibility and many other benefits of the cloud.

Cloud adoption has become an essential step in every enterprise's digital transformation journey. The right IT expertise and guidance from a technology partner can simplify this process and help you achieve cost savings for your business.


The 6 Best Cloud Storage Courses on LinkedIn Learning to Take in 2021 – Solutions Review

The editors at Solutions Review have compiled this list of the best cloud storage courses on LinkedIn Learning that IT professionals should consider if they're looking to grow their skills.

Cloud storage services have gained popularity over the years due to their increased scalability. In recent years, concerns regarding accessibility, security, and data transfer rates in the public cloud fueled demand for hybrid and private cloud storage. Public cloud storage services are adaptable and enable users to store and sync their data to an online server. When data and files are stored in the cloud instead of a local drive, they are available on multiple devices, allowing users to access and modify files from different computers and mobile devices. Additionally, cloud data storage can function as a backup system for hard drives.

With this in mind, the editors at Solutions Review have compiled this list of the best cloud storage courses on LinkedIn Learning to consider taking. The platform is perfect for those looking to take multiple courses or acquire skills in multiple areas, or for those who want the most in-depth experience possible through access to LinkedIn Learning's entire course library or learning paths. In sum, LinkedIn Learning offers training in more than 13 distinct categories with thousands of modules. The list below includes links to the modules and our take on each.

Note: Courses are listed in no particular order.

Description: Storage is one of the most widely utilized cloud-computing services. Companies are eager to take advantage of object-based storage with unlimited scalability, and IT professionals are the ones who need to make it work. This course covers the basics of cloud storage, including storage planning, budgeting, and security. Instructor David Linthicum demos account creation and management with Amazon Web Services, but the lessons are high level and applicable to almost any cloud solution. Learn about file, block, and object storage; explore planning and requirements gathering; and review three practical use cases for cloud storage, including one featuring logging and other storage management subsystems.

GO TO TRAINING

Description: Google Cloud remains in the top three cloud platforms, and storage remains a high priority for individuals and businesses. In this course, instructor Mark Johnson covers the fundamentals of storage on the Google Cloud platform (GCP). Mark begins with an overview of object storage on GCP. He dives into both relational and non-relational databases, with useful explanations and demos. During the course, Mark addresses GCP services like Firestore, Bigtable, and Cloud Spanner. He wraps up with an introduction to data warehousing and analytics.

GO TO TRAINING

Description: Azure Storage is a Microsoft cloud storage solution for modern data storage scenarios. Azure Storage offers a massively scalable object store for data objects, a file system service for the cloud, a messaging store for reliable messaging, and a NoSQL store. This course gives a broad view of the overall landscape of Azure Storage, as well as an introduction to all the concepts. Instructor Charbel Nemnom focuses on providing hands-on knowledge to employ and manage data storage strategies on Azure effectively, along with the latest features Azure Storage has to offer for IT professionals and developers. In particular, you can explore Azure Blob Storage, Azure Files, and Azure Queues, as well as security, import and export, and backup services for Azure Storage. By the end of the course, you'll have everything you need to build and manage your own Azure Storage solutions.

GO TO TRAINING

Description: Amazon Web Services offers solutions that are ideal for managing data on a sliding scale, from small businesses to big data applications. This course teaches system administrators the intermediate-level skills they need to successfully manage data in the cloud with AWS: configuring storage, creating backups, enforcing compliance requirements, and managing the disaster recovery process. The training can also be used as preparation for the Data Management domain within the AWS Certified SysOps Administrator exam.

GO TO TRAINING

Description: Discover how to properly scale applications on the Amazon ecosystem. This course shows IT pros how to use AWS network and data storage design scalability services, techniques, and tools. Throughout the course, instructor Lynn Langit covers scaling networks around server-based and serverless application architectures, as well as scaling files and data. Learn how to use migration tools such as Server Migration Service, explore scaling considerations for hybrid scenarios, and review how best to scale common architectures. This course can also prepare you for the Network Design domain in the AWS Certified Solutions Architect Professional exam.

GO TO TRAINING

Description: Earning the AWS Certified Solutions Architect Associate (SAA-C02) certification validates your ability to architect and deploy applications on AWS technologies, which, in turn, can set you apart from others in your field. This course is part of a nine-part series that's been completely updated for the March 2020 version of the qualifying exam. Tom Carpenter goes over the fundamentals of AWS storage design, covering the key services an AWS professional needs to know about. After providing an overview of the different storage services offered by AWS, he delves into Amazon Simple Storage Service (S3), the primary file or object storage service in AWS, covering basic S3 storage and management concepts. He also discusses additional AWS storage services, including Glacier, Elastic Block Store (EBS), Elastic File System (EFS), and FSx.

GO TO TRAINING

Tess Hanna is an editor and writer at Solutions Review covering Backup and Disaster Recovery, Data Storage, Business Process Management, and Talent Management. Recognized by Onalytica in the 2021 "Who's Who in Data Management." You can contact her at thanna@solutionsreview.com



7 benefits of Cloud Computing in the telecom industry – Telecom Lead

In the past decade or so, cloud computing has gained immense popularity and made an impact on almost every sector, whether business, technology, health care, or any other. All of this has led to an increase in global spending on cloud computing, and more and more businesses are now implementing it in their operations. The main advantage of cloud computing is that you pay only for the cloud services you actually use, which reduces operating costs and helps you run your business more cost-effectively.

As mentioned, cloud computing has made an impact in almost every industry, and one of the industries where it has made an enormous impact is telecom. In the telecom industry, cloud computing has helped reduce operational and administrative costs, while providing a unified communication and collaboration system and a massive content delivery network. Cloud computing has allowed telecom service providers to focus on the essentials of their business rather than on IT concerns such as server maintenance and updates.

Let's have a look at some of the vital benefits of cloud computing for the telecom industry:

With cloud computing, network providers can offer software at much lower rates by virtualizing it on remote servers, which allows computing resources to be allocated as required and reduces hardware costs as well. This helps revenue grow faster than costs.

A very useful feature of cloud computing is the flexibility to allocate resources when required: users can scale resources such as compute, server, storage and network capacity up or down as needed.

The flexibility and stability of cloud computing make it possible for network providers to meet peak loads and seasonal demand variations. When load is high, they can easily scale up resources from the cloud service provider and handle the peak without problems or downtime.
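The scale-up/scale-down decision described above can be sketched as a simple threshold rule; the thresholds and fleet limits here are illustrative assumptions, not any provider's actual autoscaling API:

```python
def desired_instances(current, load_per_instance_pct,
                      scale_up_at=80, scale_down_at=30,
                      min_instances=1, max_instances=20):
    """Add capacity under peak load, release it when demand falls."""
    if load_per_instance_pct > scale_up_at:
        return min(current + 1, max_instances)
    if load_per_instance_pct < scale_down_at and current > min_instances:
        return current - 1
    return current

# Seasonal peak: the fleet grows under load and shrinks off-peak,
# so capacity (and cost) tracks demand instead of the worst case.
fleet = 2
fleet = desired_instances(fleet, 95)   # peak load: grow
fleet = desired_instances(fleet, 50)   # steady state: hold
fleet = desired_instances(fleet, 10)   # off-peak: shrink
print(fleet)
```

Real autoscalers add smoothing and cooldown periods so the fleet doesn't oscillate, but the core idea is this same feedback loop.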

The different Cloud Delivery Models available can help deliver IT and Communication services over any network whether it is fixed, mobile, or requires worldwide coverage.

Through Cloud Computing, a vast range of communication services can be delivered hassle-free such as audio calls, video calls, video conferences, messaging, broadcasting, etc.

With the help of cloud computing, it has become much easier for cloud service providers and telecom companies to collaborate. The advent of cloud computing has also led to improved data center efficiency and server utilization.

Another very useful aspect of cloud computing is that migration costs are very low. If a service provider's customers are not satisfied with the cloud services they are receiving, it is quite easy for the provider to migrate to a new solution: terminate the previous contract, sign a new one, and transfer the user data to the new servers, retaining customers and keeping them happy at minimal cost.

Cloud computing offers extensive data backups, allowing companies to back up, store, and secure critical customer data on multiple cloud servers at once. If one server fails, the data remains available on the others, and the service provider can carry on with business immediately.
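A minimal sketch of that multi-server backup idea, using in-memory objects as stand-ins for real storage nodes: writes go to every replica, and reads fail over to whichever replica is still up.

```python
class Replica:
    """A toy storage node that can be marked as failed."""
    def __init__(self):
        self.data = {}
        self.alive = True

    def put(self, key, value):
        if self.alive:
            self.data[key] = value

    def get(self, key):
        if not self.alive:
            raise ConnectionError("replica down")
        return self.data[key]

def replicated_write(replicas, key, value):
    """Store every record on all replicas."""
    for r in replicas:
        r.put(key, value)

def failover_read(replicas, key):
    """Read from the first replica that responds."""
    for r in replicas:
        try:
            return r.get(key)
        except ConnectionError:
            continue  # try the next replica
    raise RuntimeError("all replicas unavailable")

replicas = [Replica(), Replica(), Replica()]
replicated_write(replicas, "invoice-42", {"amount": 100})
replicas[0].alive = False              # one server fails...
print(failover_read(replicas, "invoice-42"))  # ...the data survives elsewhere
```

Production systems layer quorums, consistency checks and re-replication on top of this, but the failover pattern is the same.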

One of the biggest benefits of cloud computing is that user data remains secure, and providers can often predict whether a server is likely to fail. Even if a server does fail, it isn't an issue, since the data is already backed up. With conventional computing and networking, by contrast, a natural disaster could mean all the data is lost.

Compared with conventional alternatives, cloud computing is also considered eco-friendly, since it puts a very minimal load on the environment in terms of carbon emissions.

These were only some of the benefits that cloud computing can provide to the telecom industry. In conclusion, cloud computing has had a huge impact on various industries and sectors including both public and private enterprises. The majority of businesses are using cloud services to help their business grow and meet the constantly changing demands of their customers. The Telecom Industry is also witnessing massive growth with the implementation of cloud into their business operations.


EDA in the Cloud: Key to Rapid Innovative SoC Design – Eetasia.com

Article By : Mahesh Turaga, Cadence Design Systems

More and more companies are turning to EDA in the cloud as they gradually overcome concerns about security and IP protection.

Simultaneous mega-trends are shaping multiple industries from aerospace and defense, automotive and high-tech to healthcare and others. These include 5G, autonomous vehicles, industrial internet of things (IIoT), electrification, hyperscale computing and artificial intelligence / machine learning (AI/ML). Add cloud to the mix, and we have another generational disruption that has driven business over the past decade and been further accelerated by our current global situation, changing the way we work, live, communicate and entertain. Cloud opportunities go far beyond flexible ubiquitous access.

In the preceding decade, the move towards cloud computing occurred primarily in sectors like retail and finance, with the advent of leading cloud vendors such as Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform and others accelerating the trend. In the electronic design automation (EDA) space, until recently, traditional concerns about security, protecting intellectual property (IP) and data outweighed the significant advantages offered by computing in the cloud such as flexibility, scalability and productivity.

That is now changing, and the cloud-enabled value of each of those industries is driving the need for intelligent systems. We now see that the only way such systems can be created is by using cloud-enabled computing tools and methods. It is the systems that are driving the need for massively parallel computing with close-to-linear performance growth and virtually unlimited scalability while maintaining the highest level of accuracy. Those results are possible only in the cloud. With leading foundries in the space adopting cloud and acknowledging the security of cloud infrastructure by having their process design kits (PDKs) in the cloud, security concerns have by and large diminished.

Multiple companies involved with EDA and systems development from electronics to mechanical design are increasingly recognizing the transformative power of the cloud and cloud computing to deliver fast-paced innovation to their end consumers, enabling users to come up with new business models that would not have been possible previously.

New startups driving innovation in emerging industries such as electrification are at the forefront of cloud adoption as a way to collaborate globally, deliver innovation at lightning speed and compress the traditional notions of product life cycles from decades to a few years, whether it's coming up with a new electric vehicle or an eVTOL. These companies are proving to the broader EDA and systems space that the cloud is so much more than a delivery mechanism. The cloud is an accelerator of innovation, business reinvention and lightning growth.

To innovate faster in a digital transformation scenario, the five key pillars of innovation in an enterprise (people, data, processes, technology and tools) are reimagined in more efficient ways by broad cloud adoption. The advances brought by hyperscalers make this transformation even more seamless for companies of all sizes, from small and medium businesses to large enterprise, systems and semiconductor companies. The question of cloud adoption is then not about cost savings or even about technology per se, but about a fundamental business model transformation that companies can achieve to better serve their end customers.

Collaboration, innovation and cloud computing

In the EDA space, companies can benefit by collaborating in the cloud seamlessly across groups and geographies, thereby accelerating innovation from the chip level to the systems level. Chip designers can collaborate on complex chip and SoC designs instantaneously and communicate in real time with their mechanical counterparts through ECAD-MCAD collaboration in the cloud.

Real-time, dynamic collaboration across these two key domains enables many of the companies to come up with the most complex, next-generation cyber-physical products and systems that are at the core of the current generational trends. Industry leadership must take steps towards a next-generation cloud platform that enables this seamless collaboration across different disciplines, from the chip level to the systems level, from electronics to mechanical domains, thus enabling a true digital twin.

Beyond internal collaboration, industry collaboration is simplified and expedited via the cloud. Cloud collaboration brings together the foundry team, design team and EDA team for advanced process node adoption, proliferation and support. Machine learning (ML) is now required to overcome the technical and EDA tool flow complexity for processes below 7nm. And the cloud provides the necessary compute throughput for these ML workloads.

The cloud as a computer

The theoretically unlimited compute capacity in the cloud is another driver of this innovation. While many large companies invested in on-premises server farms and today find it more economical to continue to run locally, there's an increasing realization that a simple cost comparison is too short-term.

Thinking long-term and reimagining how moving to the cloud will transform their businesses and their end customers in the next 5, 10 or 15 years is going to reshape the dynamics in EDA going forward. Smaller and medium-sized companies are already increasingly moving to the cloud to take advantage of the unlimited compute capacity that helps them deliver complex SoCs in months versus years while collaborating seamlessly across remotely distributed teams.

This trend is going to accelerate in the coming years, and we are seeing users adapt to this reality. But they require a seamless transition to the cloud and new cloud-based business models, with more sophisticated collaboration capabilities.

Along with the move to the cloud come the challenges of optimizing a company's compute needs in the cloud, not only to control costs but also to democratize innovation across the enterprise. Those taking advantage of EDA in the cloud are increasingly demanding business models prevalent in retail, to run anything from a logic simulation to the most complex chip emulation to large multiphysics simulations on demand.

While the last decade was mostly about retail business transformation via the cloud, the next decade is going to be about the transformation of electronics and mechanical systems companies due to cloud computing.

This article was originally published on EE Times.

Mahesh Turaga is VP of business development for cloud at Cadence Design Systems Inc.



The Impact of Remote Working on Cloud Development – Amico Hoops

Remote work is a concept that has been around for a long time: some lucky employees have long been able to work from home or wherever they want. However, due to the COVID-19 pandemic and developments in cloud technology, the number of remote workers has increased very quickly. This way of working has proven beneficial for both companies and employees, and even now, many employees (and companies) want to continue working remotely. So, what is the relationship between this new way of working and cloud technology? Can we expect more people to work remotely in the near future, and even for it to become the default way of working? We've answered these questions for you below.

We need to start by giving basic information first because, in order to explain the relationship between these two concepts, we must first define what they are. Remote working is a flexible working model that does not require an employee to be in the office to do their job. It is not specific to a particular sector, and it is possible to work in this way in any business line that does not require being physically present in the office. We know that white-collar employees, in particular, spend their time in front of the computer in the office. Another thing we know is that they don't have to be in the office to do this. They can do the same from anywhere, as long as they have access to a computer and an internet connection. If you don't need to go to the office to do your job, you are a good candidate for remote work.

It is possible to define cloud technology as an online storage service that can be accessed with web-based applications that do not require installation. We can store all our applications, programs, and data on the Internet in a virtual machine (i.e., in the cloud) and access this data from anywhere with any device. The technology that enables us to do this is called the cloud.

In fact, these definitions alone are enough to understand how the two concepts affect each other. Cloud computing was not created for remote working, but this working model expands its boundaries by taking advantage of this technology and makes it possible to do things that were not possible before.

As we mentioned above, remote working is not a new way of working, but the jobs that could benefit from this privilege were quite limited. Working remotely was often an option exclusive to the IT department, because other departments were not able to do things like share documents and hold meetings online.

We briefly explained what cloud technology is above. This technology enables information stored online to be accessed from anywhere. In other words, if you need to send last month's reports to the department manager, you don't need to be in the office to do it. You can even do it using a service like Google Sheets, which is publicly available.

Similarly, increased internet speeds have made it possible to host online meetings: now it's enough to connect to the same platform instead of being in the same room. In other words, cloud technology expanded the scope of remote work and changed even the lines of business that could not work in this way before. Likewise, remote working has also affected cloud technology. The types of services and storage options covered by this technology continue to change and evolve according to the needs of companies. Five years ago, for example, it was very difficult for 20 employees to come together in an online meeting room and share documents: multiple applications were required, and they did not work efficiently together. Advances in cloud technology have made it possible to do this seamlessly with a single application today.

Cloud technology and remote work are interconnected concepts: they constantly influence and improve each other. Cloud developers are finding new ways to do more and more types of work remotely. Employees, on the other hand, can work and share data from anywhere, not just from their homes.

In 2019, even before the pandemic started, 1.5 million people were working remotely in the United Kingdom. In the United States, 3.2% of the workforce (approximately 4.3 million people) was working remotely. There is no institution that has officially announced the 2021 data, but this figure is estimated to reach 26.7% of the workforce during the 2020-2021 period. Some analysts expect more than 35 million Americans to work remotely by 2025, and these numbers keep growing. 97% of remote workers do not want to go back to the office after the pandemic. All this data shows us that remote working is not a temporary solution, and cloud technology will continue to support this way of working.


Who is Andy Jassy? Amazon’s new CEO visits the White House – Yahoo Finance

Newly minted Amazon (AMZN) CEO Andy Jassy faces his first major public test as the head of the tech behemoth Wednesday, as he joins fellow tech CEOs to discuss national cybersecurity concerns with President Joe Biden.

Jassy, who took over as Amazon CEO from founder Jeff Bezos on July 5, won't be the most well-known name in the room on Wednesday, especially since he'll be next to Apple (AAPL) CEO Tim Cook, Microsoft (MSFT) CEO Satya Nadella, and Alphabet (GOOG, GOOGL) CEO Sundar Pichai. But that's likely to change soon.

Jassy, 53, may not be a household name like Bezos, but the Scarsdale, New York-born executive is far from an unknown in tech circles. Jassy has been with Amazon since 1997, having joined the company after graduating from Harvard Business School. That same year, Jassy married Elana Caplan, with whom he has two children.

While he worked in marketing for the e-commerce side of Amazon's business, his time spent with Bezos set him up for his eventual role as CEO of Amazon's most profitable business: Amazon Web Services.

Jassy helped turn what was initially a proprietary computer system for Amazon into the premier cloud computing service, used by everyone from the U.S. government to Netflix. In its last quarter, AWS saw operating income of $4.19 billion, compared with Amazon's e-commerce business, which had operating income of $3.50 billion.

Andy Jassy, CEO Amazon Web Services, speaks at the WSJD Live conference in Laguna Beach, California, U.S., October 25, 2016. REUTERS/Mike Blake

Jassy is known as a sharp executive who pays attention to fine details. He regularly read press releases related to AWS before they were sent out, people familiar with the process told Insider.

While head of Amazon's crown jewel, Jassy showed support for social justice issues, calling for changes to policing in the U.S. following the killing of Breonna Taylor in 2020, and declaring his support for the Asian American community in the country amid an uptick in hate crimes.


Still, he's also defended Amazon's sale of its controversial facial recognition technology, which was previously found to misidentify individuals of color. The company put a one-year moratorium on selling the software to police departments in the wake of protests following the murder of George Floyd. In April, the company extended the moratorium indefinitely.

Jassy has come into his position at a time of major upheaval for Amazon. Not only is the company facing stiffer competition in the cloud space from the likes of Microsoft and Google, but it's also under investigation for possible antitrust violations.

Amazon has already been sued by Washington, D.C. Attorney General Karl Racine, who accused Amazon of violating the District of Columbia Antitrust Act by forbidding third-party sellers from offering cheaper rates for their products on competing websites.

Amazon is one of the largest employers in the U.S., and after years of complaints from warehouse workers, labor unions are beginning to take action. The International Brotherhood of Teamsters recently announced it will begin working to organize Amazon workers, an effort that could succeed where an earlier campaign to unionize a warehouse in Bessemer, Alabama, failed.

Despite these challenges, Jassy will likely still have Bezos's guidance. The founder and now spaceman will continue to serve as chair of Amazon's board of directors, meaning he'll have ultimate say over big-picture topics ranging from how the company responds to legal matters to whether it will launch new products.

How Jassy, Bezos, and Amazon will assist the government in ensuring that it and critical infrastructure companies have the cybersecurity capabilities needed to fend off cyberattacks is still up in the air. But as the largest cloud provider in the country, Amazon will likely play an important role.


More here:
Who is Andy Jassy? Amazon's new CEO visits the White House - Yahoo Finance

Read More..

UW, Carnegie Mellon to pioneer platforms that harness astrophysical data to unravel the universe’s mysteries – UW News


August 25, 2021

Image of the Rubin Observatory summit facility in Cerro Pachón, Chile. Rubin Observatory/NSF/AURA

The University of Washington and Carnegie Mellon University have announced an expansive, multiyear collaboration to create new software platforms to analyze large astronomical datasets generated by the upcoming Legacy Survey of Space and Time, or LSST, which will be carried out by the Vera C. Rubin Observatory in northern Chile. The open-source platforms are part of the new LSST Interdisciplinary Network for Collaboration and Computing known as LINCC and will fundamentally change how scientists use modern computational methods to make sense of big data.

Through the LSST, the Rubin Observatory, a joint initiative of the National Science Foundation and the Department of Energy, will collect and process more than 20 terabytes of data each night and up to 10 petabytes each year for 10 years and will build detailed composite images of the southern sky. Over its expected decade of observations, astrophysicists estimate the Department of Energy's LSST Camera will detect and capture images of an estimated 30 billion stars, galaxies, stellar clusters and asteroids. Each point in the sky will be visited around 1,000 times over the survey's 10 years, providing researchers with valuable time series data.
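
As a sanity check on the figures quoted above, a quick back-of-envelope calculation is illustrative (the 365-night assumption is an upper bound; real surveys lose nights to weather and maintenance):

```python
# Back-of-envelope check of the LSST data volumes quoted in the article.
# Figures come from the text; the nightly cadence assumption is ours.

TB_PER_NIGHT = 20          # raw data collected each night
NIGHTS_PER_YEAR = 365      # optimistic upper bound on observing nights
SURVEY_YEARS = 10

tb_per_year = TB_PER_NIGHT * NIGHTS_PER_YEAR
pb_per_year = tb_per_year / 1000          # 1 PB = 1000 TB (decimal units)
total_pb = pb_per_year * SURVEY_YEARS

print(f"~{pb_per_year:.1f} PB/year")      # consistent with "up to 10 PB each year"
print(f"~{total_pb:.0f} PB over the full survey")
```

The rough 7 PB per year this yields sits comfortably under the article's "up to 10 petabytes each year" ceiling.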

Scientists plan to use this data to address fundamental questions about our universe, such as the formation of our solar system, the courses of near-Earth asteroids, the birth and death of stars, the nature of dark matter and dark energy, and the universe's murky early years and ultimate fate.

Tools that utilize the power of cloud computing will allow any researcher to search and analyze data at the scale of the LSST, not just speeding up the rate at which we make discoveries but changing the scientific questions that we can ask, said Andrew Connolly, a UW professor of astronomy, director of the eScience Institute and former director of the Data Intensive Research in Astrophysics and Cosmology Institute commonly known as the DiRAC Institute.

The Rubin Observatory will produce an unprecedented data set through the LSST. To take advantage of this opportunity, the LSST Corporation created the LSST Interdisciplinary Network for Collaboration and Computing, whose launch was announced Aug. 9 at the Rubin Observatory Project & Community Workshop. One of LINCC's primary goals is to create new and improved analysis infrastructure that can accommodate the data's scale and complexity, resulting in meaningful and useful pipelines of discovery for LSST data.

Many of the LSST's science objectives share common traits and computational challenges. If we develop our algorithms and analysis frameworks with forethought, we can use them to enable many of the survey's core science objectives, said Rachel Mandelbaum, professor of physics and member of the McWilliams Center for Cosmology at Carnegie Mellon.

Connolly and Mandelbaum will co-lead the project, which will consist of programmers and scientists based at the UW and Carnegie Mellon, who will create platforms using professional software engineering practices and tools. Specifically, they will create a cloud-first system that also supports high-performance computing systems in partnership with the Pittsburgh Supercomputing Center, a joint effort of Carnegie Mellon and the University of Pittsburgh, and the National Science Foundation's NOIRLab. The LSST Corporation will run programs to engage the LSST Science Collaborations and broader science community in the design, testing and use of the new tools.

The complete focal plane of the LSST Camera is more than 2 feet wide and contains 189 individual sensors that will produce 3200-megapixel images. Jacqueline Orrell/SLAC National Accelerator Laboratory/NSF/DOE/Rubin Observatory/AURA

The LINCC analysis platforms are supported by Schmidt Futures, a philanthropic initiative founded by Eric and Wendy Schmidt that bets early on exceptional people making the world better. This project is part of Schmidt Futures work in astrophysics, which aims to accelerate our knowledge about the universe by supporting the development of software and hardware platforms to facilitate research across the field of astronomy.

Many years ago, the Schmidt family provided one of the first grants to advance the original design of the Vera C. Rubin Observatory. We believe this telescope is one of the most important and eagerly awaited instruments in astrophysics in this decade. By developing platforms to analyze the astronomical datasets captured by the LSST, Carnegie Mellon University and the University of Washington are transforming what is possible in the field of astronomy, said Stuart Feldman, chief scientist at Schmidt Futures.

The software funded by this gift will magnify the scientific return on the public investment by the National Science Foundation and the Department of Energy to build and operate Rubin Observatory's revolutionary telescope, camera and data systems, said Adam Bolton, director of the Community Science and Data Center at NSF's NOIRLab. The center will collaborate with LINCC scientists and engineers to make the LINCC framework accessible to the broader astronomical community.

New algorithms and processing pipelines developed at LINCC will be usable across fields within astrophysics and cosmology to sift through false signals, filter out noise in the data and flag potentially important objects for follow-up observations. The tools developed by LINCC will support a census of our solar system that will chart the courses of asteroids; help researchers understand how the universe changes with time; and build a 3D view of the universe's history.

Our goal is to maximize the scientific output and societal impact of Rubin LSST, and these analysis tools will go a huge way toward doing just that, said Jeno Sokoloski, director for science at the LSST Corporation. They will be freely available to all researchers, students, teachers and members of the general public.

Northwestern University and the University of Arizona, in addition to the UW and Carnegie Mellon, are hub sites for LINCC. The University of Pittsburgh will partner with the Carnegie Mellon hub.

See the original post here:
UW, Carnegie Mellon to pioneer platforms that harness astrophysical data to unravel the universe's mysteries - UW News

Read More..

ISS eyes missions to Moon and Mars with edge, cloud and open source IT – Verdict

Since its launch into space earlier this year, the Spaceborne Computer-2 (SC-2) on the International Space Station (ISS) has been enabling several important experiments, including crop cultivation projects and efforts to monitor astronaut health.

These experiments showcase the potential for edge computing, cloud computing and open source software to advance space research and play a key role in missions to explore the Moon, Mars and beyond. These experiments have involved technology from IBM, Red Hat, Hewlett Packard Enterprise, and Microsoft, also illustrating the benefits of a collaborative approach by big tech to space exploration.

SC-2 is based on Hewlett Packard Enterprise's Edgeline EL4000 converged edge computing systems, which combine data processing and data storage capabilities with analytics software. SC-2 is designed to process data obtained from hundreds of data-collecting sensors and instruments, including satellites and cameras.

The computer's real-time data processing capabilities, which include Graphics Processing Units (GPUs) for artificial intelligence/machine learning workloads, significantly enhance the ability to learn from and act on data collected in space.

Traditionally, data collected in space had to be sent to Earth on hard drives for analysis, a process that could take weeks. Now, the ability to process data locally on the space station means that only data requiring deeper analysis needs to be sent to Earth.
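
The edge-triage idea described above can be sketched in a few lines. This is an illustrative toy, not NASA or HPE code; the confidence-threshold rule and field names are invented for the example:

```python
# Toy sketch of edge filtering: process sensor readings locally and
# downlink only the ones that need deeper analysis on Earth.

def needs_deep_analysis(reading, threshold=0.9):
    """Hypothetical triage rule: flag readings the onboard model is unsure about."""
    return reading["model_confidence"] < threshold

readings = [
    {"id": 1, "model_confidence": 0.98},   # routine: stays on board
    {"id": 2, "model_confidence": 0.42},   # ambiguous: send to Earth
    {"id": 3, "model_confidence": 0.95},   # routine: stays on board
]

# Only the ambiguous reading joins the downlink queue.
downlink_queue = [r for r in readings if needs_deep_analysis(r)]
print(f"downlinking {len(downlink_queue)} of {len(readings)} readings")
```

The payoff is exactly the one the article describes: the downlink, the scarce resource, carries a fraction of what the sensors produce.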

This is where cloud computing comes into play. America's space agency, NASA, has teamed up with Microsoft and IBM, whose cloud data centers on Earth are used to store and facilitate further processing and analysis of data collected in space. IBM, for example, is involved in an initiative to analyse DNA sequencing data, having developed a custom edge computing solution that runs on HPE's SC-2.

The solution leverages Red Hat CodeReady Containers, which wrap analytical code in tiny digital containers. These are then transmitted to IBM's cloud data centers on Earth, where researchers develop, test and prepare code to be pushed back to the ISS.

The DNA sequencing project, called Genes in Space 3, is used for identifying microbes on the ISS. This can benefit multiple initiatives, including monitoring astronaut health or discovering possible infections on the space station. The DNA research currently being carried out on the ISS is expected to play a key role in future missions, including NASA's Artemis program, which envisages the establishment of a sustainable presence on the Moon en route to Mars.

Another technology already supporting space exploration, and one that will benefit future missions to the Moon and Mars, is open source, a type of computer software that allows anyone to study, develop, and share the software code. NASA has been using open source in some of its R&D projects for at least a decade: previous initiatives involving open source include the Mars Ingenuity Helicopter, while future initiatives include VIPER (Volatiles Investigating Polar Exploration Rover), which in 2023 will be sent to the Moon to search for water.

The benefits of open source software to space exploration are now widely recognised; among them is the potential to combine diverse skills and perspectives to tackle complex challenges.

Used together, edge computing, cloud computing and open source promise to transform the way space exploration exploits the power of data, enabling a more efficient and rapid way of processing, analysing, sharing and acting on data. Ultimately this will ensure that space exploration is safer, an essential requirement for longer missions into outer space.


See the original post here:
ISS eyes missions to Moon and Mars with edge, cloud and open source IT - Verdict

Read More..

Sangfor Featuring Gartner Report on Network Detection and Response: Using AI to Combat AI Purpose-Built AI Models in NDR – Business Wire

HONG KONG--(BUSINESS WIRE)--Sangfor Technologies, a leading vendor of cyber security and cloud solutions, works closely with customers and IT analyst firms like Gartner to develop new products and improve its already stellar offering of security, cloud computing and infrastructure solutions.

One of the recent Gartner research publications, Emerging Technologies: Emergence Cycle for AI in Security for Malware Detection, drills down into how AI is enabling the world of network security and network detection and response (NDR).

Gartner's research resulted in several interesting findings.

The Gartner research includes recommendations on how to incorporate AI into malware detection in areas of relevance including CASBs, EDR, SWGs and WAFs. Gartner focused on dividing their research into subgroups: endpoints, performance monitoring, modelling, encryption, ransomware, and code analysis.

Based on this research, and drawing on its years of security experience in the IT industry, Sangfor has released a detailed whitepaper that explains how to combat weaponized AI with purpose-built AI models that look for small, anomalous or suspicious behaviours within magnitudes of activity across long periods of time. NDR tools have become very popular for threat detection and automated response because they use AI to find small malicious behaviours by analysing large amounts of network traffic.
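
As a rough illustration of the idea, and emphatically not Sangfor's actual models, a toy statistical baseline can flag hosts whose activity deviates sharply from the rest of the fleet. The data and the modified z-score rule here are invented for the example:

```python
# Toy anomaly detector: flag hosts whose daily connection count is a strong
# outlier relative to the fleet, using the median-based modified z-score
# (robust to the very outliers it is hunting for).
import statistics

def anomalies(counts_per_host, cutoff=3.5):
    values = list(counts_per_host.values())
    median = statistics.median(values)
    mad = statistics.median(abs(v - median) for v in values) or 1.0
    return [host for host, count in counts_per_host.items()
            if 0.6745 * abs(count - median) / mad > cutoff]

# Four hosts behave normally; one makes ~10x the usual connections.
traffic = {"host-a": 102, "host-b": 98, "host-c": 101, "host-d": 99, "host-e": 950}
print(anomalies(traffic))  # only the outlier host is flagged
```

Production NDR systems replace this single statistic with learned models over many behavioural features, but the underlying shape, baseline normal activity and flag small deviations from it, is the same.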

For more detailed information and to read the Sangfor & Gartner report in its entirety, click HERE: https://connect.sangfor.com/gartner-ndr-whitepaper

Sangfor Cyber Command's NDR functionality includes purpose-built AI models that are proven to significantly improve overall security detection and response capabilities. By monitoring internal network traffic, correlating security events with user behaviour analysis, and adding in global threat intelligence, Cyber Command reveals breaches and, through careful analysis, identifies hidden threats within the network.

For more information on Sangfor Cyber Command, or to research Sangfor's other cyber security, cloud computing and infrastructure solutions, please visit us online, or email us at marketing@sangfor.com directly, and trust Sangfor to make your digital transformation easier and more secure.

Source: Gartner Research Note, Emerging Technologies: Emergence Cycle for AI in Security for Malware Detection, published 27 October 2020, by analyst(s) Nat Smith and Rustam Malik. GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.

More:
Sangfor Featuring Gartner Report on Network Detection and Response: Using AI to Combat AI Purpose-Built AI Models in NDR - Business Wire

Read More..