
According to physicists, there is a new quantum paradox that casts doubt on a pillar of reality... – The Queens County Citizen

A tree falls in a forest, and no one is there to hear it. Does it make a sound? Perhaps not. Now suppose someone does arrive to hear it; then surely a sound was made. If you think that second answer is obviously correct, you may need to revise your opinion, because a new paradox in quantum mechanics suggests otherwise.

If you have come across Einstein's theory of relativity, you will be aware of how it overturns common-sense ideas about physical reality. Quantum mechanics clashes with common sense even more. Common sense says that when someone observes an event happening, it really happened, and that it is possible to make free choices, or at least statistically random ones.

Common sense also says that a choice made in one place cannot instantly influence a distant event. These intuitive ideas are widely held by physicists. But new research, now published in Nature Physics, shows that quantum physics must break at least one of them. The result is the strongest in a long series of discoveries in quantum foundations.

Quantum physics works extremely well at describing the behaviour of tiny objects, by which we mean atoms or subatomic particles. But quantum theory does not give definite answers to questions such as "where is this particle right now?" Instead, it only provides probabilities for where the particle will be found when it is observed.

For Niels Bohr, one of the founders of the theory, this is not because we lack information, but because physical properties such as position do not actually exist until they are measured. This is also why certain properties of particles cannot be perfectly measured at the same time. The strangeness deepens with the entangled state, in which a pair of distant particles share a single quantum state.


Visit link:

According to physicists, there is a new quantum paradox that casts doubt on a pillar of reality... - The Queens County Citizen

Read More..

Physicists may have found a way to create traversable wormholes – ZME Science

Credit: Pixabay.

The universal speed limit, which we commonly call the speed of light, is fundamental to the way the universe works. But human imagination knows no real limits. In science fiction, you often hear about wormholes, which are objects that enable faster-than-light travel by instantaneously transferring passengers from one point in spacetime to another.

Although the General Theory of Relativity and the Standard Model of Physics can theoretically support the existence of wormholes within their frameworks, they forbid traversable wormholes.

But physicists at Princeton flexed some serious mathematical muscles and found a loophole. By exploiting quirks of quantum mechanics within a five-dimensional universe, the researchers claim that it may be possible to create a wormhole large enough for humans and their spacecraft to travel through and instantly emerge somewhere else on the other side of the universe. Alas, such a thing likely cannot exist in nature, and an artificial traversable wormhole would be impossible to create with today's technology.

A wormhole, or Lorentzian wormhole, is a sort of theoretical tunnel through space-time, often used as the preferred mode of interstellar travel in franchises like Star Trek. The opening is a shortcut through intervening space to another location in the universe. That stands in stark contrast to a black hole, which is less of a tunnel and more of a meat grinder. However, some physicists claim that black holes and wormholes share many characteristics.

The existence of wormholes was first proposed by Karl Schwarzschild, whose solutions to Einstein's field equations form the basis for the inference of black holes. Sometimes, black holes or black hole binary systems may form connections between different points in spacetime.

The problem is that these wormholes collapse almost immediately, which would block matter from crossing from one end to the other. All hope isn't lost yet, though.

Juan Maldacena, the Carl P. Feinberg Professor of theoretical physics at the Institute for Advanced Study, and Alexey Milekhin, a graduate student of astrophysics at Princeton University, wrote a new paper in which they discuss the conditions that may allow the existence of traversable wormholes.

In their paper, the two physicists outline some very exotic circumstances that may allow wormholes stable enough for humans to cross through. This includes the existence of negative energy, for instance.

In the theory of general relativity, we usually assume that energy is greater than zero at all times and everywhere in the universe. This has a very important consequence for gravity: energy is linked to mass via the formula E = mc². So negative energy would also imply negative mass. Positive masses attract each other, but with a negative mass, gravity could suddenly become a repulsive force.
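The chain of reasoning above can be written out explicitly: restoring the mass-energy relation to its full form shows why a negative energy density would behave like a negative gravitating mass.

```latex
E = mc^2 \;\Longrightarrow\; m = \frac{E}{c^2},
\qquad\text{so } E < 0 \;\Rightarrow\; m < 0 .
```

In Newtonian terms, the attraction between two bodies has magnitude \(F = G m_1 m_2 / r^2\); if one of the masses is negative, the sign of the force flips, and gravity pushes the bodies apart rather than pulling them together, which is the kind of repulsive effect a traversable wormhole needs to stay open.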

Quantum theory, however, allows negative energy. According to quantum physics, it is possible to borrow energy from a vacuum at a certain location, like money from a bank. In their paper, the authors mention the Casimir Effect, in which quantum fields may produce negative energy while propagating along a closed circle.

"However, this effect is typically small because it is quantum. In our previous paper, we realized that this effect can become considerable for black holes with large magnetic charge. The new idea was to use special properties of charged massless fermions (particles like the electron but with zero mass). For a magnetically charged black hole these travel along the magnetic field lines (in a way similar to how the charged particles of the solar wind create the auroras near the polar regions of the Earth)," Maldacena and Milekhin explained to Universe Today.

These sorts of wormholes would be allowed by the Standard Model of particle physics. However, they would be so tiny that not even a strand of human hair would have room to pass through. What's more, they would only exist over equally tiny distances.

In order to support wormholes large and stable enough for humans to use, the physicists had to think outside the Standard Model box. The pair of researchers turned to the Randall-Sundrum II model, also known as the 5-dimensional warped geometry theory, which describes the universe in terms of five dimensions.

This five-dimensional model of spacetime can enable scientists to describe physics that would normally be undetectable, allowing negative energy to exist.

The produced wormholes would look like medium-sized, charged black holes. They would be large enough for a spacecraft to travel through, however, the pilot would have to navigate very powerful tidal forces.

If the pilot could somehow navigate through this chaos, entering the wormhole would instantly propel the crew to another point in spacetime, but the "instant" factor is only true from the perspective of the traveler. From the perspective of an outside observer, the travel would take as much time as light would take to travel from point A to point B, which is consistent with the theory of general relativity.

However, it would be next to impossible for this to work. Such wormholes do not exist in nature and artificially creating them would involve engineering negative mass.

In other words, it's unlikely that wormholes will ever become a practical means of traveling through space. Nevertheless, it's always fascinating to hear that concepts once thought to belong solely to the realm of fiction may actually be plausible.

The paper was published on the pre-print server arXiv.

The rest is here:

Physicists may have found a way to create traversable wormholes - ZME Science

Read More..

Bill and Ted ‘Face the Music’ in their latest movie, as we all have to eventually – NBC News

Bill and Ted of Excellent Adventure fame captured an era and an age in movies paying homage to the teenage slackerdom that defined a generation, which is why it's somewhat jarring to see the franchise get a sequel some 30 years later but also so illuminating.


Bill and Ted Face the Music, out Friday, does just as its title promises, exploring the life of characters that seemed too of-their-time to make a modern day resurgence but are actually all the more compelling for fully embracing their older selves as witnesses to what life, and history, gives us as it continues to unfold, ready for it or not.

In the third installment, Bill and Ted are no longer the happy-go-lucky kids they once were. (That role is reserved for their peppy daughters.) Indeed, the far-fetched time-travel plots of the films only reinforce the realities of true life.

The metalheads have grown up adrift in a world where their music hasn't been embraced, despite the prophecy in earlier movies that it would be. Since there are plenty of real-life Gen Xers with bands or other pursuits that never made it and who are still trying to figure out what to do, the goofiness of Bill and Ted speaks to the way in which many of the people who grew up watching them have themselves refused to grow up. The movies' sense of instability is easy to relate to, particularly as middle-aged people today have found even stable professions shaken by the economic roller coaster of the last 20 years, culminating in the pandemic crisis.


But there's a flip side to the midlife malaise and joint couples therapy that Bill and Ted are stuck in, which also reveals a hidden truth for much of Gen X: They don't get the credit they deserve. Bill and Ted are slacker dudes hovering around 50 who have managed, despite their own cluelessness, to embark on successful time travel adventures and even cheat Death himself before he became one of their band members.

True, they generally succeed in life through dumb luck and a serendipitous confluence of events, but they're smarter than they're perceived to be and often surprise us. Despite what many fans might think, they're not stoner dudes in disguise. Nor are they full-on Valley or surfer-boy types. Writers Ed Solomon and Chris Matheson confirmed this long ago, and Alex Winter and Keanu Reeves said they landed the roles because they didn't play them as stoners or airheads.

They're actually really sweet guys who are nice to everyone they encounter no matter how tweaked or crazy they are. (Except their evil alternate selves, whom they battle in the second film.) They are endearing characters because they are not malicious or angry. (OK, they get a bit testy in this one, but they're 30 years older, dude.) Their amiability is a big reason why many fans will likely indulge in this sequel. And their approach to life could be what we need right now.

The major directive of the new installment, after all, is that Bill S. Preston, Esq. (Winter) and Ted "Theodore" Logan (Reeves) are the leaders of the band Wyld Stallyns, which includes their wives, Joanna and Elizabeth (Jayma Mays and Erinn Hayes), and have it in their hands to craft the epic song that brings about world peace and creates universal harmony. That was and is a very '80s notion. We could use some of that high-minded optimism in these dark times.

A quick recap on how they got to this point. In Bill and Ted's Excellent Adventure (1989), an emissary from the future, Rufus (portrayed by the late George Carlin), taught the two SoCal buddies how to travel through time so that they could collect personages of historical significance to ace their high school history report, graduate and move on to their loftier musical goals.

In Bill and Ted's Bogus Journey (1991), maniacal, mechanical clones from the future come to kill them and undo the predicted future success of their band. But with the help of Death (William Sadler) and a brilliant, heavenly alien named Station, they strike back at their alter egos. (With clever cinematic tributes to auteurs ranging from Ingmar Bergman to Tim Burton, Bogus is superior.)

In Face the Music, Rufus' daughter Kelly (Kristen Schaal) comes from the future to warn them that they have only 77 minutes to finally write that epic song that will unite everyone and everything, or the fabric of space and time as we know it will come undone. Most unfortunate.

To save everything in time, Bill and Ted flash forward into different points in the future to find where their future selves wrote the great song of universal harmony, if they can. (Time travel uses up real time minutes, so the race is on.) This time, their music-scholar daughters Theodora "Thea" Preston (Samara Weaving) and Wilhelmina "Billie" Logan (Brigette Lundy-Paine) journey separately back through the aeons to assemble the ultimate band including everyone from a Stone Age drummer to Jimi Hendrix for their most honorable dads.

Despite a slow-going first act, the movie and humor pick up once the time-travel segments kick in. (Post-credits tip: Stick around.) The cameo by grunge rocker Dave Grohl is fun, and who knew rapper Kid Cudi was so well versed in quantum physics? Further, modern digital effects show the future, the disrupted present and Hell in most outstanding fashion.


Underneath the silliness of it all, the Bill and Ted movies have always had a positive message about uniting people through the power of music. It's been said that the films have a deluded sense of optimism, but at a time when Mike Judge's prescient sci-fi comedy Idiocracy from 2006, in which our future country has literally become governed by idiots, has become tragically realistic, we could use a dose of bodacious and nonheinous fun to help us lighten up a bit before things really do get cray-cray in the real world in 2020.

Face the Music isn't a classic for the ages, but it has its funny moments. Bill and Ted have never been about providing all the answers anyway. They're just here to show us the way, dudes.

Bryan Reesman is a New York-based reporter, author of the book Bon Jovi: The Story and host of the podcast Side Jams.

Read more:

Bill and Ted 'Face the Music' in their latest movie, as we all have to eventually - NBC News

Read More..

How to use Steams new chat filters to block profanity and slurs – The Next Web

Welcome to TNW Basics, a collection of tips, guides, and advice on how to easily get the most out of your gadgets, apps, and other stuff.

Today Steam revealed a new system called Text & Chat Filtering, which essentially allows you to customize what you see and don't see in private spaces. You can choose which words you see and don't see, even otherwise innocuous words that may be offensive or triggering to you. We'll show you how.

Valve announced the new feature today via Steam Labs, its experimental space. Essentially, Steam bans profanity and slurs in public spaces, such as its forums, but is choosing not to do so for private spaces such as chat. This is to empower users to choose what they see from others, specifically marginalized groups [trying to] reclaim language for themselves.

But let's assume you're not trying to reclaim anything and would prefer Steam filter profanity privately as well. Here's how you do that.

At the moment, the feature is in beta, so you'll need to join the experiment in Steam Labs. However, all signs indicate this is likely to come to Steam generally. To find it, go to your Account page, then go to Preferences. Scroll down, and under Community Content Preferences, you'll find something that says Steam Labs Experiment 011: Text Filtering. Click Join the experiment. Note that you can leave the experiment any time you choose.

You'll immediately see several new options in this section. The first is whether you want to filter out profanity or slurs. You can choose to allow profanity but not slurs, or to have them obscured with symbols. You can also choose not to do this for your Steam friends, as presumably some words will carry a different weight if they come from someone you know.

Next you'll see the more granular controls over what language is filtered. You can add words to filter individually, or you can upload a list of words. Ditto the other option: you can add words or upload a list of words that'll never be filtered. You can also download both lists. Speaking as someone who uses profanity as a way of communicating with family (we all swear like sailors), there's some profanity that doesn't bother me and some that does. So these options allow you more control over what you'll see and not see.
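As an illustration of the logic these settings describe, a filter that combines a blocklist, a personal allowlist, and symbol obscuring might look like the sketch below. The word lists and function names are invented for illustration; this is not Steam's actual implementation.

```python
# Illustrative sketch of a chat filter with a blocklist, an allowlist
# that overrides it, and symbol obscuring. Stand-in words only.

BLOCKLIST = {"darn", "heck"}   # words to filter (stand-ins for real profanity)
ALLOWLIST = {"heck"}           # words this user has chosen never to filter

def obscure(word: str) -> str:
    """Replace every character of a filtered word with a symbol."""
    return "".join("@#$%"[i % 4] for i in range(len(word)))

def filter_message(message: str) -> str:
    out = []
    for word in message.split():
        # Compare case-insensitively; the allowlist wins over the blocklist.
        key = word.lower().strip(".,!?")
        if key in BLOCKLIST and key not in ALLOWLIST:
            out.append(obscure(word))
        else:
            out.append(word)
    return " ".join(out)

print(filter_message("that was darn difficult"))  # that was @#$% difficult
print(filter_message("what the heck"))            # what the heck (allowlisted)
```

The allowlist-beats-blocklist ordering mirrors the behaviour described above, where words you explicitly permit are never filtered even if they appear on a filter list.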

And that's it! Now you'll be able to control what you see in Steam chat more fully. Good luck!


Read the rest here:

How to use Steams new chat filters to block profanity and slurs - The Next Web

Read More..

Why Cloud Computing Stocks & ETFs Soared This Year – Yahoo Finance

Cloud is fast emerging as the new model of computing. And the pandemic has accelerated the move to the cloud.

These trends are likely to continue even after the crisis eases as many companies have already extended remote work policies.

The WisdomTree Cloud Computing ETF (WCLD) is the best performing ETF in the space, up about 68% this year. It tracks an equal-weighted index of emerging cloud companies.

WCLD aims to provide a pure-play cloud exposure. Salesforce.com (CRM) is its top holding currently. The ETF charges 0.45% annually in fees and has about $638 million in assets.
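As a back-of-the-envelope illustration of what the fee and asset figures quoted above mean in practice, the arithmetic below simply reuses those numbers; the $10,000 position is a hypothetical example.

```python
# What a 0.45% annual expense ratio costs: on a single position, and
# implied across the fund's roughly $638 million in assets.

EXPENSE_RATIO = 0.0045       # 0.45% per year
FUND_ASSETS = 638_000_000    # approximate assets under management

def annual_fee(position: float, ratio: float = EXPENSE_RATIO) -> float:
    """Yearly fee deducted from a position of the given size."""
    return position * ratio

print(annual_fee(10_000))       # 45.0  -> $45 a year on a $10,000 position
print(annual_fee(FUND_ASSETS))  # 2871000.0 -> roughly $2.9M across the fund
```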

To learn more about this and other cloud computing ETFs, please visit the ETF Center of Zacks.com.


Continued here:
Why Cloud Computing Stocks & ETFs Soared This Year - Yahoo Finance

Read More..

Why cloud computing isn’t the answer to all business problems – Verdict

The Covid-19 pandemic has driven the rhetoric around cloud computing into overdrive. However, there is a right size for everything, including the corporate commitment to cloud.

When public cloud computing first hit, early adopters and pundits were quick to say it was the future, full stop. Later on, as the market has continued to mature, there is now a second generation adding to the chorus of all-cloud computing.

General cloud and SaaS technologies have a lot going for them. Between simplification, ease of access for employees not on site, and the reduction of equipment and software that IT staff have to maintain, cloud technologies can do wonderful things for a business.

But the absolutist position that everything has to go into the cloud is one that most companies should avoid, because no matter how accommodating and easy to use cloud is, it represents a loss of control. Decisions made by a third party, or even mistakes made by a third party, can have detrimental effects. For a lot of functions, considering the good track record of the major cloud suppliers to date, this is a pretty easy risk/benefit calculation to make.

However, things that are core to the company, things that have a direct effect on serving the customer, such as manufacturing control, require a different risk/benefit calculation. Systems that are absolutely critical to the core function of a company require a more hands-on approach, where keeping these functions in house, or at least redundantly in house, should be considered.

The rise of edge computing, systems designed to sit near the endpoint to handle fast response times and local data processing, is the perfect example of the cloud model not really being one-size-fits-all. Edge computing wouldn't exist otherwise. Locations with limited or poor connectivity are another good example of how cloud cannot do it all.

Companies can and should embrace cloud computing, and in some cases even move the majority of their IT functions to the cloud. However, the core tenet of digital business is improved customer experience and engagement. Viewed in that light, cloud might be the solution for many problems, but not all problems.

GlobalData is this website's parent business intelligence company.

Continue reading here:
Why cloud computing isn't the answer to all business problems - Verdict

Read More..

How to manage the risk of cloud sprawl with centralised management – Cloud Tech

The public cloud is designed to enable agility, scalability, and adaptivity. The result is a massive proliferation of public cloud services and APIs offered by cloud platform providers, which help organisations increase the rate of innovation far beyond they could in the private data centre.

While these new cloud services deliver attractive benefits, they also increase management complexity and the potential for costly misconfiguration errors that can compromise critical resources and sensitive information.

Public cloud environments effectively include an infinite number of configuration combinations, offering a long list of opportunities for misconfigurations. Users may also change settings at any time, introducing a misconfiguration where there was not one previously. And with automation, these misconfigurations can rapidly be duplicated. For the public cloud, misconfigurations are a serious problem, which is why Gartner recently claimed that 99% of cloud security failures are the customer's fault.

According to another report, most cloud misconfigurations are the result of inexperienced users or a failure to update security tools designed to do things like monitor and validate configurations. Nearly 40% occur due to efforts to merge data during M&A activities. Other times, a cloud storage bucket is left open to the public, enabling anyone to access it; no special hacking skills or tools required. Perhaps the most famous example is that of the US National Security Agency, where a cloud-based server was left open and security documents were freely accessible using just a web browser.


The results of a cloud misconfiguration can be devastating. Between 2018 and 2019, more than 33 billion records were exposed as companies moved to the cloud without having appropriate security in place. During that time, the number of records exposed by cloud misconfigurations rose by 80%, as did the total cost to companies associated with those lost records.

Addressing this challenge needs to start with prioritising secure access to cloud resources, especially business-critical applications. Access is especially relevant since the number of people touching the cloud infrastructure has dramatically increased over the past few years. In the past, only a handful of people in an organisation, primarily DevOps teams, had access to the cloud infrastructure. Today, deploying applications and making engineering changes to a given infrastructure has become far more common.

Of course, keeping track of which applications are the most critical may seem straightforward. But as applications multiply, and cloud usage evolves, assumptions about which applications are the most valuable can often be wrong. To keep up, organisations need to be monitoring for increases in application usage over time so that the most critical applications are not only prioritised to ensure availability and optimal user experience, but that access is controlled so that errors can be kept to a minimum.

To effectively detect and remediate intrusions and protect critical services, security teams also can benefit from the ability to view the current inventory of all cloud resources through a single console. That way, they can monitor and analyse traffic and drill down on specific services and traffic patterns that are suspicious. Specifically, they need to be able to visualise traffic to effectively distinguish between valid and threatening traffic.

Now is an excellent time to put in place a central cloud security management system where you can streamline security operations across multiple clouds while interfacing with fewer touchpoints, ultimately enabling consistent visibility across a broad cloud environment. In a static, on-premises environment, such issues can be addressed using a configuration management database (CMDB). But rapid changes to cloud services and configurations introduce new challenges. Management systems isolated to per-cloud instances inevitably result in siloed visibility. Add to that the dynamic nature of cloud deployments, and it can become nearly impossible for an organisation to consistently assess its security posture. This is a root cause of many critical misconfigurations that occur in increasingly complex cloud infrastructures.

A centralised cloud security management system needs to include a common framework for security policies to be deployed and orchestrated across a multi-cloud environment. These tools are typically categorised as Cloud Security Posture Management, Cloud Workload Protection and Cloud Access Security Brokers. These tools need to perform configuration analysis, event analysis, compliance checks, and data inspection, regardless of the cloud environment in which a solution is deployed, along with offering remediation recommendations. Even better, those configurations should be able to be compared against industry standards, such as PCI, HIPAA, or NIST, to ensure ongoing compliance and adherence to best practices.
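The configuration-analysis and compliance-check duties described above can be sketched in a few lines: compare each resource's settings against a rule set and report violations. The resource fields and rule names below are hypothetical, not any vendor's actual schema.

```python
# Minimal sketch of the kind of check a Cloud Security Posture Management
# tool automates: evaluate every resource against every rule, collect
# findings for any resource that fails a rule.

RULES = [
    # (rule id, predicate that returns True when the resource is compliant)
    ("no-public-buckets", lambda r: not (r["type"] == "bucket" and r["public"])),
    ("encryption-at-rest", lambda r: r.get("encrypted", False)),
]

def check_posture(resources):
    """Return a list of (resource name, failed rule id) pairs."""
    findings = []
    for res in resources:
        for rule_id, is_compliant in RULES:
            if not is_compliant(res):
                findings.append((res["name"], rule_id))
    return findings

inventory = [
    {"name": "logs-bucket", "type": "bucket", "public": True,  "encrypted": True},
    {"name": "app-db",      "type": "db",     "public": False, "encrypted": False},
]

print(check_posture(inventory))
# [('logs-bucket', 'no-public-buckets'), ('app-db', 'encryption-at-rest')]
```

A real CSPM product would pull the inventory from cloud provider APIs and map rules to frameworks such as PCI, HIPAA, or NIST, but the core loop is the same: rules over an inventory, producing findings.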

Just as importantly, these cloud security capabilities need to be able to see and communicate with each other, regardless of where they are deployed. This requires tools such as cloud connectors that can translate and normalise data, policies, and enforcement protocols on the fly. This not only ensures the collection of critical security data from across the distributed cloud but also enables resources to be marshaled as part of a unified response to a detected threat.

Cloud sprawl can quickly result in significant risk to any organisation that does not take proactive measures. A truly centralised cloud security management solution must not only integrate natively into each cloud platform on which it is deployed, but also serve as a central point of truth covering the entire distributed infrastructure. This includes the breadth of services utilised in the cloud, from IaaS through PaaS to SaaS, as well as applications and resources operating both on and off the network.

Such a strategy plays a critical role in preventing issues like misconfigurations and shadow IT that plague organisations and put entire digital operations at risk, no matter the size of the company.

Photo by Łukasz Łada on Unsplash

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Link:
How to manage the risk of cloud sprawl with centralised management - Cloud Tech

Read More..

AWS, Azure or Google? Six steps to decide the best cloud for SAP implementations – Cloud Tech

Analysis I have worked on SAP engagements for enterprise customers for almost 30 years. While SAP continued to develop application features and content during these years, the hosting of SAP generally fell into the same classic selection process as any application: what is the service level, what is the cost, who has the happiest customers?

However, with the advent of public cloud (IaaS), those tried and tested criteria no longer give customers an accurate evaluation of each option. As such, here are some of the key criteria SAP customers need to include in their evaluation of hosting options.

Of course, cost comes first in most scenarios. Nothing happens in an enterprise without a good business case. However, at first glance, negotiated costs can be deceiving. Enterprise agreements, short-term discounts, migration funding and more can all muddy the waters when it comes to getting a clear perspective of the pricing you are signing up for. In order to best predict what future costs will look like, it's important to understand each hyperscaler's attitude towards cost, and then extrapolate from their pricing history.

Additionally, with hyperscaler infrastructure comes the great benefit of metered charging, where you only pay for what you use, resulting in variable costs. While this is actually a very good thing in general, it can cause headaches for procurement and necessitate new processes for IT to properly manage these variable costs. When selecting a provider, you need to understand which hyperscaler/partner can best help you see and control ongoing metered costs.

Nowadays, we expect public cloud to be more resilient than on-premises infrastructure. And, while this is generally true, not all clouds are equal, especially for applications such as SAP. You will need to evaluate the amount of downtime each hyperscaler has experienced over the last 12-18 months to get a sense of how they compare. SLAs are one thing; historic performance is a much better guide.

Publicly published statistics on hyperscaler downtime show that AWS fares far better than Azure and better than, or similar to, Google Cloud Platform. SAP, as we know, is very sensitive to downtime, especially unplanned downtime. Choosing the most stable platform is an important part of the selection criteria for all your systems, but particularly for SAP given its criticality to the business.

The best innovation is happening in the cloud these days and, as everything is or will be in the cloud eventually, innovation and speed to innovation needs to be an integral part of your IT road map for the next 10 years at least. Right now, AWS is the leader in getting new innovations and new ideas to the market quickly. Azure categorizes itself as fast followers, which is an important but safer position in the market. Google, while very good at what they do around data items and other categories, does not display the same customer obsession and innovation focus in its cloud capacities as its competitors.

Why does this matter? When looking at innovation, particularly the speed of innovation, you need to also consider the technology adoption cycle. This is the timeframe from when a new technology is introduced to when it is ultimately retired. When the adoption cycles of innovation among hyperscalers reach a one to two-year difference, this becomes a critical differentiator. Some would say that right now, AWS is already one to two years ahead of its competitors, meaning that the technology it introduces will be released, run its cycle and be retired by the time it reaches other cloud providers. Selecting the most innovative platform is critical for any long-term strategic decisions.

AWS has always led the way on the most performant technology, both on storage and compute. What AWS has done recently is launch all of its instances on its Nitro hypervisor, which takes the hypervisor load off the VM and gives workloads access to all of the compute resources. Nitro is, in effect, an add-on component to every VM. This allows for unparalleled performance.

Additionally, AWS is innovating with its own chips and its own chip designs, releasing its Graviton family of instances, which have already been shown to be not just cheaper but higher performing than instances built on other providers' chips. This gives every indication that AWS will continue to lead the way on performance. When running SAP, one of the biggest complaints end users typically have is a lack of performance. Overall performance, and performance when you need it, is one of the biggest benefits IT departments can give to their customers, so choosing the most performant platform for your systems is table stakes.

Another benefit of public cloud is its open APIs: publicly available application programming interfaces that developers in offices (and garages and living rooms) all over the world are coding against. This is an example of hyperscalers and their partner ecosystems adding a significant amount of additional innovation that their customers can access directly. As a result, we consistently see brand new use cases for BI, speech, chatbots and other great technologies that can integrate very simply with public cloud.
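As a minimal sketch of what coding against such an open API looks like, here is a Python fragment using AWS's boto3 SDK. The parsing helper and the choice of region are our own illustration, not from the article, and the live call assumes credentials are configured out of band.

```python
"""Sketch: consuming a hyperscaler's open API (here, AWS EC2 via boto3).

Only the boto3 calls inside the __main__ guard touch a real cloud
account; the response-parsing helper is a hypothetical illustration.
"""

def running_instance_ids(pages):
    """Pull the IDs of running instances out of describe_instances pages."""
    ids = []
    for page in pages:
        for reservation in page.get("Reservations", []):
            for instance in reservation.get("Instances", []):
                if instance.get("State", {}).get("Name") == "running":
                    ids.append(instance["InstanceId"])
    return ids

if __name__ == "__main__":
    import boto3  # AWS SDK for Python; needs credentials configured
    ec2 = boto3.client("ec2", region_name="us-east-1")
    pages = ec2.get_paginator("describe_instances").paginate()
    print(running_instance_ids(pages))
```

The same pattern — an SDK wrapping a documented public REST API — is what lets third-party tools integrate with a hyperscaler without any private agreement.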

This proves once again that public cloud is the platform best suited for future innovation. It also suggests that the amount of innovation is directly related to the number of partners that hyperscalers have in their ecosystems. AWS has, by far, the most, and that is very important if you want access to these third-party capabilities: you will probably find that partners enable them for AWS before any other hyperscaler. The more customers there are on a platform, the more partners will develop there, which in turn creates more traction for new customers. This network effect is something AWS has cultivated for 10 years and is very difficult for others to catch up on.

Ultimately, automation is the most important secret ingredient of them all. Automation not only allows you to do things automatically, remotely and quickly, but also with higher quality. Quality builds are essential for installed software systems like SAP.

When SAP is run classically on-premises, most people spend their time keeping the lights on, maintaining and fixing things manually. The downside is that people make mistakes. Manual steps are inherently risky, and you could end up with situations where Dev has a different kernel patch version than COS, which has a different version than Production. Suddenly, you get unexpected defects when you run workloads on production. The way to avoid this is through automation. Automation removes manual errors and ensures that there is a repeatable, reliable process for both the build and the maintenance of the SAP landscape. This higher quality reduces the noise in the environment and reduces the amount of work and cost needed to maintain the system.
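The kind of check that automation makes routine can be sketched in a few lines of Python. This is a toy drift detector; the system names and kernel patch strings are invented for the example, not taken from any real landscape.

```python
def find_drift(versions, baseline="PRD"):
    """Return the systems whose kernel patch level differs from the baseline.

    `versions` maps system name -> kernel patch string; any system that
    does not match the baseline is flagged for automated remediation.
    """
    target = versions[baseline]
    return sorted(name for name, v in versions.items() if v != target)

# Invented landscape: Dev lags Production by one patch level.
landscape = {"DEV": "753 PL 900", "QAS": "753 PL 1000", "PRD": "753 PL 1000"}
print(find_drift(landscape))  # -> ['DEV']
```

Run on a schedule against every system in the landscape, a check like this catches version drift before it produces the "unexpected defects in production" scenario described above.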

Another plus of automation is the agility it enables. Suddenly, you can do things faster. So, when users want a system refresh, a restore from a backup, or to patch a system, these things can now be done much more quickly. And, with automation, you can surface it in a portal that allows end users or project team members to self-serve on the maintenance of the landscape. This agility delivers satisfaction to the project team, as they can try out new ideas quickly. This is, of course, the fundamental premise of innovation: the ability to try something quickly, fail at it fast or, if it does work, promote it quickly into production. If you want to innovate, you need to be agile, and if you want to be agile, you must automate.

Picture credit: SAP

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Continued here:
AWS, Azure or Google? Six steps to decide the best cloud for SAP implementations - Cloud Tech

IBM has built a new drug-making lab entirely in the cloud – MIT Technology Review

The news: IBM has built a new chemistry lab called RoboRXN in the cloud. It combines AI models, a cloud computing platform, and robots to help scientists design and synthesize new molecules while working from home.

How it works: The online lab platform allows scientists to log on through a web browser. On a blank canvas, they draw the skeletal structure of the molecular compounds they want to make, and the platform uses machine learning to predict the ingredients required and the order in which they should be mixed. It then sends the instructions to a robot in a remote lab to execute. Once the experiment is done, the platform sends a report to the scientists with the results.
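The flow the article describes — submit a target molecule, have a model predict an ordered recipe, dispatch the instructions to a remote robot — can be sketched as a toy pipeline. The lookup table below stands in for the real machine-learning model, and the compound names and steps are invented for illustration; none of this reflects RoboRXN's actual interfaces.

```python
"""Toy sketch of a RoboRXN-style flow: target molecule in, predicted
recipe out, instructions formatted for a remote lab robot.

A lookup table stands in for the ML model; all names are hypothetical.
"""

PREDICTED_RECIPES = {
    "aspirin": [
        "charge reactor with salicylic acid",
        "add acetic anhydride",
        "heat to 85 C and stir",
        "cool, filter and dry the product",
    ],
}

def predict_recipe(target):
    """Stand-in for the model that predicts ingredients and mixing order."""
    if target not in PREDICTED_RECIPES:
        raise ValueError(f"no predicted route for {target!r}")
    return PREDICTED_RECIPES[target]

def robot_instructions(target):
    """Number the predicted steps as instructions for the lab robot."""
    steps = predict_recipe(target)
    return [f"step {i}: {s}" for i, s in enumerate(steps, start=1)]

print(robot_instructions("aspirin")[0])  # -> step 1: charge reactor with salicylic acid
```

The interesting engineering in the real system lies in the prediction step (a machine-learning model trained on reaction data) and in the closed loop back to the scientist: the robot runs the steps and the platform returns a report, exactly as the paragraph above describes.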

Picture credit: IBM Research

Why it matters: New drugs and materials traditionally require an average of 10 years and $10 million to discover and bring to market. Much of that time is taken up by the laborious repetition of experiments to synthesize new compounds and learn from trial and error. IBM hopes that a platform like RoboRXN could dramatically speed up that process by predicting the recipes for compounds and automating experiments. In theory, it would lower the costs of drug development and allow scientists to react faster to health crises like the current pandemic, in which social distancing requirements have caused slowdowns in lab work.

Not alone: IBM is not the only one hoping to use AI and robotics to accelerate chemical synthesis. A number of academic labs and startups are working toward the same goal. But the concept of allowing users to submit molecules remotely and receive analysis of the synthesized molecule is a valuable addition in IBM's platform, says Jill Becker, the CEO of one such startup, Kebotix: "With RoboRXN, IBM takes an important step to speed up discovery."


UTAR and Alibaba Cloud Sign MoU to Transform Cloud Computing Education – QS WOW News

Universiti Tunku Abdul Rahman (UTAR) and Alibaba Cloud (Malaysia) Sdn Bhd officially signed a memorandum of understanding (MoU) on 11 August 2020 at Sungai Long Campus to transform the cloud computing education offerings for students and staff.

The MoU is part of the Alibaba Cloud Academic Empowerment Program (AAEP) developed for local universities. It aims to provide advanced cloud computing technology for students and staff. With easy access to high-quality learning resources, UTAR students can pursue Alibaba Cloud certification and stand a chance to intern at the company after successful completion of the course.

The collaboration also aims to empower digital talents and tech professionals through Alibaba Cloud's Elastic Compute Service (ECS) and Data Transfer courses, overhauling the current cloud computing curriculum at the university. Both parties will jointly promote cloud computing by conducting collaborative seminars, guest lectures, workshops and training activities to lay a strong technical foundation for the younger generation.

Attendees of the MoU were UTAR President Ir Prof Dr Ewe Hong Tat, Malaysia Alibaba Cloud Intelligence General Manager Jordy Cao, UTAR Vice President for Internationalisation and Academic Development Ir Prof Dr Yow Ho Kwang, Alibaba Cloud (Malaysia) Sdn Bhd Country Marketing Leader Angie Ng, Division of Community and International Networking Director Assoc Prof Dr Lai Soon Onn, Alibaba Cloud (Malaysia) Sdn Bhd Associate Online Marketing Manager Elaine Ooi, Faculty of Information and Communication Technology (FICT) Deputy Dean of Academic Development and Undergraduate Programmes Ts Dr Cheng Wai Khuen, FICT lecturer Ts Tan Teik Boon and staff.

Speaking at the ceremony, Prof Ewe said, "We are indeed privileged and honored to be working together with Alibaba Cloud (Malaysia) Sdn Bhd, a global leader in cloud computing and artificial intelligence which provides reliable and secure cloud computing and data processing capabilities. On behalf of the University, I would like to take this opportunity to express my heartfelt appreciation to Alibaba Cloud Malaysia because this collaboration will help us prepare our students for the Digital Cloud Transformation journey."

"We are proud to be part of the Alibaba Cloud Academic Empowerment Program and look forward to this collaboration for greater educational benefits for our students as well as information and knowledge exchanges between Alibaba Cloud and UTAR."

He added, "When cloud technology was introduced, it became the trend of the future. It is so convenient for us to access all the required data at any place and anytime without carrying bulky laptops with us."

For his part, Jordy Cao expressed his gratitude to UTAR for the warm welcome and great hospitality, and gave a brief introduction to Alibaba Cloud and its well-known cloud computing services.

He said, "It is pivotal that students and the teaching staff are getting the best and latest cloud computing curriculum, as well as access to experienced professionals to help them validate and clarify their theoretical knowledge."

"Alibaba Cloud has been dedicated to providing best-in-class cloud services to our customers, and we will deliver exactly these experiences to UTAR to help students get the best learning resources in the industry so they can better seize the opportunities provided by the digital era."

He emphasized, "It is a good start for everyone to practice new norms in the post-COVID-19 world. With the experience of Alibaba Cloud, I strongly believe that students and teachers will greatly benefit from this."
