Category Archives: Cloud Servers
Supermicro Expands GPU System Portfolio with Innovative New Servers to Accelerate a Wide Range of AI, HPC, and Cloud Workloads – PRNewswire
"Supermicro engineers have created another extensive portfolio of high-performance GPU-based systems that reduce costs, space, and power consumption compared to other designs in the market," said Charles Liang, president and CEO, Supermicro. "With our innovative design, we can offer customers NVIDIA HGX A100 (code name Redstone) 4-GPU accelerators for AI and HPC workloads in dense 2U form factors. Also, our 2U 2-Node system is uniquely designed to share power and cooling components which reduce OPEX and the impact on the environment."
The 2U NVIDIA HGX A100 server is based on the 3rd Gen Intel Xeon Scalable processors with Intel Deep Learning Boost technology and is optimized for analytics, training, and inference workloads. The system can deliver up to 2.5 petaflops of AI performance, with four A100 GPUs fully interconnected with NVIDIA NVLink, providing up to 320GB of GPU memory to speed breakthroughs in enterprise data science and AI. The system is up to 4x faster than the previous generation GPUs for complex conversational AI models like BERT large inference and delivers up to 3x performance boost for BERT large AI training.
In addition, the advanced thermal and cooling designs make these systems ideal for high-performance clusters where node density and power efficiency are priorities. Liquid cooling is also available for these systems, resulting in even more OPEX savings. Intel Optane Persistent Memory (PMem) is also supported on this platform, enabling significantly larger models to be held in memory, close to the CPU, before processing on the GPUs. For applications that require multi-system interaction, the system can also be equipped with four NVIDIA ConnectX-6 200Gb/s InfiniBand cards to support GPUDirect RDMA with a 1:1 GPU-to-DPU ratio.
The new 2U 2-Node system is an energy-efficient, resource-saving architecture in which each node supports up to three double-width GPUs. Each node also features a single 3rd Gen Intel Xeon Scalable processor with up to 40 cores and built-in AI and HPC acceleration. A wide range of AI, rendering, and VDI applications will benefit from this balance of CPUs and GPUs. Equipped with Supermicro's advanced I/O Module (AIOM) expansion slots for fast and flexible networking capabilities, the system can also process massive data flow for demanding AI/ML applications, deep learning training, and inferencing while securing the workload and learning models. It is also ideal for multi-instance high-end cloud gaming and many other compute-intensive VDI applications. In addition, Virtual Content Delivery Networks (vCDNs) will be able to satisfy increasing demands for streaming services. Power supply redundancy is built in, as either node can use the adjacent node's power supply in the event of a failure.
About Super Micro Computer, Inc.
Supermicro (SMCI), the leading innovator in high-performance, high-efficiency server technology, is a premier provider of advanced Server Building Block Solutions for Enterprise Data Center, Cloud Computing, Artificial Intelligence, and Edge Computing Systems worldwide. Supermicro is committed to protecting the environment through its "We Keep IT Green" initiative and provides customers with the most energy-efficient, environmentally-friendly solutions available on the market.
Supermicro, Server Building Block Solutions, and We Keep IT Green are trademarks and/or registered trademarks of Super Micro Computer, Inc.
Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries.
All other brands, names, and trademarks are the property of their respective owners.
SOURCE Super Micro Computer, Inc.
Bare-faced metal-weaving cheek: Dell pachyderm moves into Robin.io's 5G telco room – Blocks and Files
A great big Dell elephant has invaded Robin.io's telco market for Kubernetes-based orchestration with its Bare Metal Orchestrator.
Robin.io has grown its Kubernetes storage facility into a complete 5G system roll-out orchestrator for telcos, putting great vertical market distance between it and other Kubernetes storage providers such as MayaData, Pure's Portworx, Red Hat OpenShift, NetApp's Astra, Ondat and others.
A statement from Dennis Hoffman, SVP and GM of Dell's Telecom Systems Business, said: "Bare Metal Orchestrator gives communication services providers an easier way to deploy and manage open network infrastructure while saving costs and time, allowing them to focus on delivering new and differentiated services to their customers."
Back in June, Dell said it was anchoring an open, cloud-native telecom ecosystem with infrastructure and solutions, industry partners and an Open Telecom Ecosystem Lab. Dell would launch a cloud-native network infrastructure with a full stack of open, scalable carrier-grade server and software offerings to help telcos provide 5G facilities at the edge.
This involved Project Metalweaver software enabling CSPs to select, autonomously deploy and manage thousands of multi-vendor compute, network and storage devices across multiple locations. It also includes reference architectures to span telecommunications in edge, core and Open RAN environments.
CSPs would be offered deployment possibilities based on Dell hardware and software, the VMware Telco Cloud Platform, and Red Hat OpenShift.
Mavenir is an end-to-end, cloud-native network software provider, and a Robin.io competitor.
Dell's Bare Metal Orchestrator is the first software to come from Project Metalweaver and gives CSPs tools to discover and inventory networked servers, bring them online, and deploy software on them. It uses so-called declarative automation to tell these servers to deploy software stacks and workloads. This can, Dell says, eliminate days or weeks of configuration and provisioning to bring network hardware into a workload-ready state in an Open RAN environment.
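Dell doesn't publish the Orchestrator's internals here, but declarative automation generally follows a desired-state-and-reconcile pattern: the operator declares what each server should run, and a control loop works out how to get there. The Python sketch below is a minimal, hypothetical illustration of that pattern; desired_state, get_actual_state, and apply_change are invented names, not Dell APIs.

    # Minimal sketch of declarative reconciliation, the pattern behind
    # "declarative automation". All names are hypothetical, not Dell APIs.
    from typing import Dict

    desired_state: Dict[str, str] = {
        "server-001": "openran-du-stack:2.4",  # declared software per server
        "server-002": "openran-cu-stack:2.4",
    }

    def get_actual_state() -> Dict[str, str]:
        """Inventory step: discover what each server is currently running."""
        return {"server-001": "openran-du-stack:2.3", "server-002": ""}

    def apply_change(server: str, stack: str) -> None:
        """Provisioning step: bring one server to its declared state."""
        print(f"deploying {stack} on {server}")

    def reconcile() -> None:
        # Compare declared state with reality and fix only the drift.
        actual = get_actual_state()
        for server, stack in desired_state.items():
            if actual.get(server) != stack:
                apply_change(server, stack)

    reconcile()

The appeal of the pattern at telco scale is that the same loop works for two servers or two hundred thousand: the configuration lives in the declaration, not in per-server scripts.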
Dell also announced a validated Solution for Mavenir Open vRAN and VMware Telco Cloud Platform, a reference architecture for Wind River Studio, and Respond and Restore for Telecom support services.
We are seeing the development of automated provisioning of potentially hundreds of thousands of embedded server/networking/cloud-native software systems at 5G edge locations, with lifecycle management built in. Storage, inevitably local and direct-attached, is a small but necessary part of this.
Dell is going to use its IT industry status and worldwide supply and support capabilities, along with specialist technology partners such as Mavenir, to blast its way into this market as a major player from day one.
Robin.io is going to have to learn how to tap dance around Michael Dell's marauding pachyderm. Maybe it needs to ally with some other server supplier, someone like HPE or Lenovo.
Human hacking increased as apps and browsers moved completely to the cloud – Help Net Security
Human hacking (phishing attacks across all digital channels) increased dramatically in 2021. SlashNext released its first report showing a 51% increase in attacks compared to 2020, with a growing share of these attacks happening outside of email.
"The cybersecurity industry has done a good job of protecting machines, but those efforts leave the most porous and vulnerable parts of any network, the humans using it, unprotected," said Patrick Harr, SlashNext CEO.
"Today's hyper-targeted spear phishing attacks, coming at users from all digital channels, are simply not discernible to the human eye. Add to that the increasing number of attacks coming from legitimate infrastructure, and the reason phishing is the number one thing leading to disruptive ransomware attacks is obvious."
Apps and browsers are how humans connect with work, family, and friends. Cybercriminals are taking advantage of this by attacking outside of email, through less protected channels like SMS text, social media, gaming, collaboration tools, and search apps.
Spear phishing and human hacking from legitimate infrastructure increased: in August 2021, 12% (or 79,300) of all malicious URLs identified came from legitimate cloud infrastructure, including AWS, Azure, outlook.com, and sharepoint.com, giving cybercriminals the opportunity to evade current detection technologies.
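That evasion is easy to picture: reputation-based filters trust well-known cloud domains, so a phishing page uploaded to one sails straight through. The Python sketch below is purely illustrative (not SlashNext's method; the host suffixes and verdict strings are assumptions) and shows why such URLs need content inspection rather than a reputation lookup.

    # Illustrative sketch: why domain reputation fails for phishing pages
    # hosted on legitimate cloud infrastructure. Not a production filter.
    from urllib.parse import urlparse

    # Hosts that pass reputation checks yet can serve attacker-uploaded pages.
    LEGIT_CLOUD_SUFFIXES = (".amazonaws.com", ".azurewebsites.net",
                            ".sharepoint.com", ".outlook.com")

    def triage(url: str) -> str:
        host = urlparse(url).hostname or ""
        if host.endswith(LEGIT_CLOUD_SUFFIXES):
            # Reputation says "trusted", so the page itself must be inspected
            # (rendering, form analysis, brand detection) to catch phishing.
            return "reputable host - requires content inspection"
        return "unknown host - check conventional blocklists"

    print(triage("https://evil-bucket.s3.amazonaws.com/login.html"))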
There was also a 51% increase in phishing in 2021 compared to 2020. That is on top of triple-digit growth in attacks in 2020 over the previous year.
In July 2021, more than one million malicious URLs were identified across all digital channels. A large percentage of those attempted attacks were targeted at those trying to access Olympics streaming sites.
Attacks have moved from email to unprotected channels including SMS text, social media, and more. The LinkedIn data breach made over one billion records available to cybercriminals, and spear phishing efforts are increasingly using that data to attack high-value targets.
Of the more than 14 million malicious URLs identified to date in 2021, 51% were credential stealing attempts. Cybercriminals use those to gain access to networks, which is why spear phishing is behind 91% of all successful cyber breaches, including ransomware attacks, data theft, and over $30 billion in financial fraud.
Social engineering attacks grew quickly in 2021, reaching 40% of all attacks, up from 6% in 2020. This is because cybercriminals are increasingly shifting from email phishing to SMS, social, and web-based threats.
The shifting phishing landscape, combined with cybercriminals being enabled with automation and access to data and intelligence, has quickly made human hacking the number one cyberthreat. Previous security strategies, including secure email gateways, firewalls, and proxy servers, are no longer stopping threats, especially as these move beyond email.
Security training and human intervention are not practical solutions to stop the threats, because the level of sophistication makes most attacks either not discernible to the human eye or well-engineered enough to draw in even the most informed person.
Cloud Native Computing Foundation Welcomes Record Number of New Silver Members – UpperMichigansSource.com
98 new members joined CNCF this quarter, driving an increasingly diverse ecosystem that continues to provide leading-edge technology innovations across the globe
Published: Oct. 13, 2021 at 12:00 PM EDT
LOS ANGELES, Oct. 13, 2021 /PRNewswire/ -- KubeCon + CloudNativeCon North America -- The Cloud Native Computing Foundation (CNCF), which builds sustainable ecosystems for cloud native software, today announced the addition of 98 new Silver members, including Cardinal Health, Niantic, Robinhood, and Trend Micro, to continue the rapid innovation in open source and cloud native communities across the world.
As the cloud native community prepares for its first in-person event since the start of this pandemic, a record-breaking number of increasingly diverse organizations have joined CNCF. Now with 725 members, CNCF has grown almost 40% since the beginning of 2020, showcasing the need for innovative technologies to respond to current and future challenges.
"We are thrilled to see that cloud native technology continues to resonate with so many organizations across industries and geographies," said Priyanka Sharma, general manager of the Cloud Native Computing Foundation. "We are welcoming more new members than we have ever added in a single quarter, a testament to the resilience of our community. CNCF is proud to welcome a diverse range of Silver members to #TeamCloudNative and we look forward to working together as we usher in the next era of cloud native."
With the addition of these new members, there are now over 140 organizations in the CNCF End User Community. This group regularly meets to share adoption best practices and feedback on project roadmaps and future projects for CNCF technical leaders to consider.
About Cloud Native Computing Foundation
Cloud native computing empowers organizations to build and run scalable applications with an open source software stack in public, private, and hybrid clouds. The Cloud Native Computing Foundation (CNCF) hosts critical components of the global technology infrastructure, including Kubernetes, Prometheus, and Envoy. CNCF brings together the industry's top developers, end users, and vendors, and runs the largest open source developer conferences in the world. Supported by more than 500 members, including the world's largest cloud computing and software companies, as well as over 200 innovative startups, CNCF is part of the nonprofit Linux Foundation. For more information, please visit http://www.cncf.io.
The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our trademark usage page. Linux is a registered trademark of Linus Torvalds.
Media Contact: Jessie Adams-Shore, The Linux Foundation, PR@CNCF.io
SOURCE Cloud Native Computing Foundation
The global cloud advertising market size is expected to grow at a Compound Annual Growth Rate (CAGR) of 19.6% during the forecast period, to reach USD…
Marketing has evolved to a great extent in the past decade; new forms of marketing have taken over with continuously upgrading tools. Marketers can target the specific customer they want from the comfort of their homes.
New York, Oct. 12, 2021 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Cloud Advertising Market with COVID-19 Impact, by Component, Application, Organization Size, Deployment Model, Vertical And Region - Global Forecast to 2026" - https://www.reportlinker.com/p06175236/?utm_source=GNW
Outdoor marketing is no longer the only medium for reaching a target audience; marketers can now pitch their products and services to exactly the audience they choose. Different forms of marketing, such as social media marketing and email marketing, help end users analyze and reach the exact kind of customer they want. Data analytics gives marketers accurate details of their target audience so that advertising can be optimized for efficient results. This increasing demand for targeted marketing and consumer analytics bolsters the growth of the cloud advertising market.
The platform segment is expected to hold a larger market size during the forecast period. By component, the cloud advertising market covers platforms and services, and the platforms segment is projected to hold the larger share of the market during the forecast period.
Cloud advertising helps companies create, manage, and execute customer engagement to drive revenue growth across industries. Platforms allow organizations to create personalized and automated web-based marketing campaigns across touchpoints such as email, social media, web, and mobile.
They manage repetitive tasks associated with campaign follow-ups and send one-pagers and emails at regular intervals. They also provide integrated data, build customer profiles, deliver content and reporting instantly, and help team members collaborate.
Platforms have a complete set of marketing tools, such as omnichannel campaign management, content management, data management, testing and personalization, and analytics. They offer capabilities such as AI technology to analyze consumer data in real-time and modify the content to be displayed. Cloud advertising uses data science and machine learning algorithms to boost conversion, visitor engagement, and loyalty.
The public cloud segment is expected to hold the largest market share in 2021
The deployment model segment includes public cloud and private cloud. The public cloud segment holds the largest market share in 2021. Public cloud is built on the cloud computing model, which shares resources (such as CPUs, servers, and racks) among various businesses, depending on demand. In the public deployment model, various resources, such as applications, storage, virtual servers, and hardware, are available to client enterprises over the internet.
The services offered over the public deployment model are either free or subscription-based. The advantages of using the public cloud include simplicity and ease of deployment.
Moreover, the initial investment required for this deployment is minimal, and there are no responsibilities involved in managing the infrastructure. The public cloud offers organizations various benefits, including scalability, reliability, flexibility, utility-style costing, and location-independent services. The major concern about the public cloud is data security; for this reason, several enterprises are moving toward private and hybrid cloud models for their cloud operations.
North America to hold the largest market size and Asia Pacific (APAC) to grow at a higher rate during the forecast period
The geographic analysis of the cloud advertising market includes five major regions: North America, Europe, APAC, MEA, and Latin America. Among all regions, North America is estimated to hold the largest market size in 2021, and the trend is expected to continue through 2026.
The region is expected to hold the largest market size because cloud marketing technology already has a strong presence there. It comprises two developed economies: the US and Canada.
The early adoption of digital marketing, the presence of top players, and the globalization of cloud services in North America are expected to drive the cloud advertising market. Enterprises in the region are investing heavily in digital marketing initiatives and improving customer experience.
In North America, the percentage of social media users, smartphone users, and ad spending is exceptionally high compared to other regions. The penetration of mobile devices in the US is more than 90%, followed by Canada. This gives marketers a strong channel for targeting potential customers. Verticals including consumer goods and retail, and media and entertainment, are expected to invest in cloud marketing technology. The growing demand for personalized content and experiences would further drive adoption in North America.
Vendors have adopted a combination of organic and inorganic growth strategies, such as new product launches and enhancements, partnerships and collaborations, and mergers and acquisitions, to strengthen their presence in the cloud advertising market and expand their customer base and market share.
Service enhancements to meet dynamic market needs and partnerships have been the most active strategies implemented by the major vendors in recent years. In the process of determining and verifying the market size for several segments and subsegments gathered through secondary research, extensive primary interviews were conducted with key people.
The breakup of the profiles of the primary participants is as follows:
By Company Type: Tier I: 25%, Tier II: 25%, and Tier III: 50%
By Designation: C-Level: 35%, D-Level: 25%, and Others: 40%
By Region: North America: 48%, Europe: 27%, APAC: 15%, and RoW: 10%
The report profiles the following key vendors: Adobe (US), Oracle (US), Salesforce (US), Google (US), IBM (US), SAP (Germany), Acquia (US), Demandbase (US), Experian (US), Kubient (US), FICO (US), HubSpot (US), Imagine Communications (US), InMobi (India), Marin Software (US), Sitecore (US), MediaMath (US), Nielsen (US), PEGA (US), and Sailthru (US).
Research Coverage
The report segments the global cloud advertising market by component into platforms and services. By application, the market has been segmented into campaign management, customer management, experience management, analytics and insights, and real-time engagement.
Based on organization size, the market has been classified into large enterprises and SMEs. Based on deployment model, it has been classified into public cloud and private cloud.
By vertical, the market has been classified into retail and consumer goods; BFSI; education; travel and hospitality; telecommunications; manufacturing; media and entertainment; and others. By region, it has been segmented into North America, Europe, APAC, MEA, and Latin America.
Key Benefits of Buying the Report
The cloud advertising market report will help market leaders and new entrants with information on the closest approximations of the revenue numbers for the overall cloud advertising market and its subsegments. It will help stakeholders understand the competitive landscape, gain more insights to better position their businesses, and plan suitable go-to-market strategies.
The report also helps stakeholders understand the pulse of the market and provides them with information on key market drivers, restraints, challenges, and opportunities. Read the full report: https://www.reportlinker.com/p06175236/?utm_source=GNW
About Reportlinker
ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.
Google and Dell offer new tools to help operators manage 5G and the edge – TechRadar
Dell and Google have both detailed new offerings for mobile operators looking to roll out new 5G services, in further evidence of the growing convergence between the technology and telecoms sectors.
Google has taken the wraps off Google Distributed Cloud, a portfolio of hardware and software services that bring Google's infrastructure to the network edge or a local data centre.
The most revolutionary 5G applications, such as Industry 4.0, virtual reality, and AI workloads that rely on real-time data, will require ultra-low latency that is simply impossible to achieve using a traditional cloud structure and a centralised data centre.
Edge computing allows data to be processed as close as possible to the point of collection, and operators are rearchitecting their networks away from a centralised core layer accordingly. Many providers now have multiple data centres distributed across their footprint, while even base stations can be used as edge sites.
Google Distributed Cloud allows operators to run core 5G and RAN functions at their edge sites, alongside enterprise applications and Google services, to support latency-sensitive applications that open new revenue streams and maximize investments in 5G infrastructure.
It's also possible to run Google Distributed Cloud at one of Google's own edge locations around the world, at a customer edge site, or in a customer data centre.
Dell Technologies is also taking a keen interest in the telco sector and has unveiled a range of services that automate the deployment and management of cloud-native infrastructure essential for maximizing 5G opportunities.
Specifically, it says the shift to the edge is one of the reasons operators are adopting OpenRAN, a vendor-neutral approach to radio technology that allows operators to mix and match products. However, it believes that managing such a complex environment, comprising multiple sites, vendors, and geographies, is a huge challenge.
Dell's new Bare Metal Orchestrator makes it possible to manage hundreds of thousands of servers across the globe and saves days or weeks of configuration and provisioning. For a carrier looking to bring a new service to market as quickly as possible and react to changing market trends, that's a lot of time saved.
Meanwhile, the company is also expanding its partner ecosystem with new services and reference architectures, with Mavenir and Wind River the latest to jump on board.
"As server technology proliferates through increasingly open telecom networks, the industry sees an immediate and growing need for remote lifecycle management of a highly distributed compute fabric," said Dennis Hoffman, GM, Dell Technologies Telecom Systems Business.
"Bare Metal Orchestrator gives communication services providers an easier way to deploy and manage open network infrastructure while saving costs and time, allowing them to focus on delivering new and differentiated services to their customers."
Azure Emissions Dashboard shows how you and Microsoft are slowly killing the planet with your cloud workloads – The Register
Microsoft has made its Emissions Impact Dashboard - formerly known as Sustainability Calculator and designed to measure the carbon impact of cloud workloads - generally available.
If that sounds familiar, it may be because Google this week made a big deal of its Carbon Footprint preview, which claims to "measure, report and reduce your cloud carbon emissions."
Corporations are under pressure to do something about sustainability, and the ability to report on carbon usage is a starting point. Hence Microsoft's post referencing customers like the Bühler Group, which "saw the need to track Scopes 1, 2, and 3 emissions" and can now use the Dashboard to do so.
Emission scopes: Scope 3 is big but hard to measure
What are these scopes? All is explained in this white paper, which says that Scope 1 is direct emissions such as from fuel for backup power generators, Scope 2 is emissions from energy consumed, primarily electricity, and Scope 3 is indirect emissions including such things as manufacturing and delivering servers and racks.
"For many organizations (apart from certain direct-emission heavy sectors such as manufacturing and energy), their Scope 3 emissions are much larger than their Scope 1 and 2 emissions," the paper states.
These scopes are part of the Greenhouse Gas (GHG) Protocol, an international body that develops standards for quantifying emissions, created by the World Resources Institute and the World Business Council for Sustainable Development. Microsoft is listed as one of the companies providing funding for the GHG Protocol, but Google is not.
What does Google's Carbon Footprint measure? According to the methodology, it covers "Google Cloud product electricity use" and "upstream emissions associated with electricity generation facilities and fuels, for most of the electricity load." It "does not include direct emissions or indirect emissions related to the value chain (respectively Google's Scope 1 and Scope 3 in the Greenhouse Gas (GHG) Protocol carbon reporting standards)," the docs say. The likely conclusion is that the Carbon Footprint dashboard is not an accurate measure of the actual carbon footprint of GCP usage, whereas Microsoft is at least having a stab at it.
Microsoft's Emissions Impact Dashboard
That said, there are a few problems with Microsoft's approach. One is that whereas Google's Carbon Footprint is simply part of the GCP console, Microsoft's Emissions Impact Dashboard is a Power BI Pro application that requires setup and carries a cost. "The Emissions Impact Dashboard runs on Power BI Pro. Get the Power BI Pro free trial," say the setup docs.
Scope 3 impact is hard to measure reliably, and in the end, measuring the power consumed by cloud resources is perhaps the most valuable thing a dashboard like this can do. There is also an argument that public cloud is inherently more efficient, thanks to shared resources, than on-premises computing, even if it may also be more expensive.
One thing that is certain is that reducing unnecessary consumption of computing resources will also benefit sustainability. Given the way all these public cloud providers push customers into using an increasing amount of premium and resilient resources, and reward customers for committing to long-term usage in advance, we should be allowed some scepticism with regard to these dashboards, though it is also true that the efforts made to make their data centres more sustainable have real value, given the scale of their usage.
Cybersecurity practitioners are convinced they need AI – now comes the hard part – SecurityBrief Australia
Article by Simon Howe, LogRhythm VP of Sales for APAC.
Like other domains, cybersecurity has been eyeing the promise of artificial intelligence (AI) and machine learning (ML) algorithms for some time.
Even back in 2019, nearly two-thirds of respondents to a Capgemini survey thought AI would help identify critical threats. Additionally, 73% of Australian organisations believed they would eventually be unable to respond to attacks without the assistance of AI.
Analysis by PwC since then confirms that AI is making organisations more resilient to cybercrime.
There's been a steady movement for certain cybersecurity functions to harness AI/ML since then. It is now particularly prevalent in malware detection, and increasingly for automating analysis and decision-making in data-intensive fields like incident detection and response.
But cybersecurity is also a particularly challenging space for AI/ML use. It's fast-moving, and the stakes are high. Algorithms put to work in security must themselves be trustworthy and secure, and that in itself is not easy to achieve.
Through our work in this space over the past seven-plus years, some ground rules have emerged. In particular, some keys to success in this space include clean data, a good business case, and the ongoing involvement of domain and technical experts.
Getting these right will solve many of the current challenges and bring cybersecurity functions closer to the AI-augmented operating model they need.
The deal on data
Data is obviously a critical input into AI and machine learning algorithms, both initially as those models are being trained, and then on an ongoing basis when the models are put into production and are expected to constantly get better at what they do.
Most organisations don't have clean, unified and consistent data to feed a model from the outset, and so many AI/ML projects begin with a period of data preparation before any of the actual AI work can proceed.
This has been understood for some time. The Capgemini survey, for example, noted that buying or building a data platform to provide a consolidated view of data should be a first step for organisations that want to use AI in cybersecurity successfully. Still, that probably understates the level of effort involved.
In the cybersecurity world, it's important to collect data from a range of different sources (antivirus, firewalls, databases, cloud, servers, and end-user devices) because that's ultimately what will enrich data science and models for AI.
That data must then be treated: processed, analysed, and pulled into a format that can be consumed by the model.
Consistency matters at this stage, and in all likelihood, most organisations won't have it. Not all logs contain the same metadata fields. Not all vendors even define log fields in the same manner.
Organisations will need to establish consistent definitions and contextualise the metadata present in their logs to understand what to train the model on and what it should be looking for.
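As a concrete, hypothetical illustration of that step, the short Python sketch below maps two vendors' differently named log fields onto one common schema; the vendor labels, field names, and normalise helper are all invented for the example.

    # Hypothetical sketch of log-metadata normalisation: two vendors name
    # the same fields differently, but training data needs one schema.
    FIELD_MAP = {
        "vendor_a": {"src": "source_ip", "dst": "dest_ip", "act": "action"},
        "vendor_b": {"SourceAddress": "source_ip",
                     "DestAddress": "dest_ip",
                     "Outcome": "action"},
    }

    def normalise(vendor: str, record: dict) -> dict:
        """Rename a raw log record's fields to the common schema."""
        mapping = FIELD_MAP[vendor]
        return {mapping[k]: v for k, v in record.items() if k in mapping}

    a = normalise("vendor_a",
                  {"src": "10.0.0.5", "dst": "10.0.0.9", "act": "deny"})
    b = normalise("vendor_b",
                  {"SourceAddress": "10.0.0.5", "DestAddress": "10.0.0.9",
                   "Outcome": "DENIED"})
    print(a, b)  # values like "deny" vs "DENIED" still need contextualising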
Just add domain expertise
As mentioned in the previous section, context raises another important point about AI/ML adoption in cybersecurity.
Domain expertise is a crucial input. The first step in an AI/ML project is to get a domain expert to define what is happening in the environment that gives rise to the need for algorithmic assistance. The need may stem from a missed or bad detection of a threat, or from the existence of too many false positives. Whatever it is, it needs to be written down.
If AI/ML is considered the best way to address this challenge and a model is then developed, domain expertise will still be needed while the model finds its feet. The expert understands what is normal and what is unusual, even if the model doesn't yet.
For example, in the days before mass remote work, it was easier to define what normal and unusual might look like in the traffic logs and patterns of a workday. Large-scale work-from-home introduced considerable unpredictability and complexity. A model will not be able to detect some of this nuance immediately, and oversight will be required.
A domain expert will be able to look at a model's output and understand what data is missing or how the model is trying to detect a cybersecurity threat. This is important in understanding how to then tune the model.
That need for tuning will be ongoing. Attack tactics and techniques are constantly changing, so organisations that employ AI/ML for cybersecurity have to constantly look at what's happening in the field to understand how that impacts the AI model and its detection ability.
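To ground that tuning loop, here is a minimal anomaly-detection sketch using scikit-learn's IsolationForest. The session features and the contamination rate are illustrative assumptions that a domain expert would revisit as "normal" shifts, for example after a move to mass remote work.

    # Minimal anomaly-detection sketch with scikit-learn's IsolationForest.
    # Features and contamination rate are illustrative, not a recommendation.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    # Toy features per session: [logins_per_hour, MB_sent_out, distinct_hosts]
    normal_traffic = np.array([[3, 12, 2], [4, 15, 3], [2, 9, 2], [5, 14, 3]])
    new_sessions = np.array([[4, 13, 2],      # resembles the baseline
                             [40, 900, 60]])  # bulk-exfiltration pattern

    model = IsolationForest(contamination=0.1, random_state=0)
    model.fit(normal_traffic)

    # +1 = inlier, -1 = flagged for analyst review; the expert then decides
    # whether a -1 is a real threat or a sign the model needs retuning.
    print(model.predict(new_sessions))  # e.g. [ 1 -1]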
Qualcomm Fires Shots At Google Over Its Tensor SoC – Android Headlines
With the Pixel 6 release just around the corner, Qualcomm has decided to take a swipe at Google for foregoing its Snapdragon SoC with the new flagship. As most of you know, Google developed a new chipset known as Google Tensor for the Pixel 6 series.
Clearly, this isn't going down well with Qualcomm, as evidenced by a tweet posted by the company. The tweet reads "We've decided to make our own smartphone SoC instead of using Snapdragon," followed by a series of red flags.
Red flags are a relatively new phenomenon on the internet. Users make a statement or comment with sarcastic overtones as a critique of something. To make it apparent, they add a bunch of red flag emojis to the message.
With that out of the way, Qualcomm wants us to think that Google's Tensor chip is a bad idea. Although the company doesn't name Google specifically, it doesn't take a genius to understand what the chip maker is trying to say.
Qualcomm is understandably frustrated about all the potential good press Google's new chip could receive. Meanwhile, Samsung has its own line of chipsets under the Exynos moniker, though the company also utilizes Snapdragon SoCs for its devices. So Qualcomm's irritation is understandable, though it comes across as being overly sour.
For Google, developing its own chipset makes a great deal of sense. Thanks to Tensor, Google will have more control over the software features it wants to implement. It is the end of an era for Qualcomm, though, since it has been involved with every Pixel phone launched so far.
It's unclear how the Pixel 6 and 6 Pro will fare with Google Tensor, though there's a general sense of positivity surrounding the new flagships. Fortunately, we don't have to wait long to find out, as Google will unveil the two flagships in less than a week.
Qualcomm will continue to churn out chipsets for Android smartphones in the near future. However, losing a major client like Google can be disappointing for the company.
Developing a smartphone chipset is a first for Google. However, the company designs its own chips for Google Cloud servers, so it's not new to this field. Previous reports have suggested that Samsung will mass-produce the Tensor chipset for Google.
Restore Data in Cloud Computing: The Best Option for You – MarylandReporter.com
If you are planning to invest in cloud computing or are already a cloud customer, you probably know the benefits of cloud data backup: efficiency, high availability, high accessibility, and elasticity. Cloud computing can also effectively address data loss from natural disasters like earthquakes, floods, and fires. And disasters are not limited to storms, lightning, and other external factors: damage caused by viruses, malware, and hacker attacks can also be effectively addressed through cloud computing. You should therefore also consider the benefit of data security in cloud computing.
When data is lost due to natural causes, you can call in experienced professionals from cloud computing services who will retrieve the lost data. With cloud security measures in place, you can rest assured that your data will be protected even if disaster strikes again. Cloud-based services ensure the safety of your data by implementing multiple layers of security and protection procedures. With these advantages, cloud computing is becoming ever more popular.
You can easily make use of cloud storage for your data, especially when disaster strikes. The best thing about cloud computing is its cost-effectiveness. You can get a large amount of storage space without paying a single cent for your data backup. This means that you can enjoy unlimited access to various applications and can run your business without any hitches.
Another advantage of the cloud is that it provides better application security than a traditional hosting environment. You need not worry about application security when you host your data in the cloud. Application security in SaaS is better than on a traditional site because there is no physical hardware for you to protect: on a traditional site you may need to install strong physical security measures such as firewalls and security cameras, which is unnecessary with SaaS.
You also do not have to worry about data reliability: disaster recovery from the cloud can be done successfully, and you do not have to wait for a disaster to test restoring data. As long as you have an active internet connection, you can restore data from the cloud fairly quickly.
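The article doesn't name a provider, but as one concrete example of how quick a cloud restore can be, the hypothetical snippet below pulls a backed-up file from Amazon S3 using AWS's boto3 SDK; the bucket and object names are placeholders.

    # Hypothetical example: restoring one backed-up file from Amazon S3.
    # Bucket and key names are placeholders.
    import boto3

    s3 = boto3.client("s3")  # credentials come from environment/config

    s3.download_file(
        Bucket="my-backup-bucket",         # placeholder bucket name
        Key="backups/2021-10-12/db.dump",  # placeholder object key
        Filename="restored-db.dump",       # local path for the restored copy
    )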
You also gain several other benefits, such as improved user accessibility, easier collaboration, reduced IT costs, reduced vendor lock-in, and enhanced functionality. These are just some of the benefits you stand to enjoy when you implement the community cloud. The biggest benefit is, of course, cost savings. When you use the public cloud, you pay for usage just as you would for a local server, but you don't have to buy any hardware, manage any servers or application servers, or pay any licensing fees. All of these costs are eliminated when you use the community cloud.
With the mainframe server offering you a choice, you need to decide whether to convert your physical server into a virtual machine or into a cloud computing virtual machine. Both solutions have their pros and cons: converting the physical server to a virtual machine offers greater flexibility at a higher price, while converting it to a cloud computing virtual machine offers ease of use with reduced costs, greater capacity, and better performance.