
Cartwright and Alhambra school districts soon to deliver free Wi-Fi – ABC15 Arizona

PHOENIX - Nearly $34 million in CARES Act money is going to be used to provide free Wi-Fi to students from kindergarten through community college. It's part of an effort to even the playing field for families who just can't afford internet at home. The problem is far from new, but the solution to bridging the digital divide at Phoenix-area schools is.

"It's about making access available to students; it's about removing inequities," said Paul Ross with Phoenix College.

The Phoenix Digital Education Connection Canopy is Ross's brainchild, an idea put into motion after schools serving high-poverty populations struggled to transition to online learning during the pandemic. On September 1, the canopy will begin delivering free Wi-Fi to potentially thousands of students.

"It brought awareness to where the gaps really existed. For a lot of people it went from being a number on a spreadsheet to being, we've got real households, students who really don't have access at home," said Ross.

He's talking about students like Greg Arzola, now studying cyber security at Phoenix College and helping to make this program a reality alongside Ross as part of his internship. But in high school, his family struggled to make ends meet while his single mother cared for seven children.

"It was just really my mom taking care of us and we didn't have a lot of extra money to spend on internet. It's kind of like either paying the rent or just having internet access," said Arzola, who had no internet access at home during high school.

The solution looks like this: infrastructure for Wi-Fi access is now installed across seven locations, casting a four-square-mile net that provides internet coverage for both the Cartwright and Alhambra school districts.

"With this, I know that children will have more benefits, better opportunities than I did, and have a better time succeeding in school and getting a better education," said Arzola.

"That's going to benefit our students for years and years to come," said Cartwright School District CFO Victoria Farrar.

Farrar says the network comes with security measures to protect children using it, giving one of the poorest districts a hand up towards a brighter future.

"This is really laying the foundation for what we know our students need, and now we have the tools and ability to make sure we sustain it going forward," said Farrar.

It's a partnership of schools, the City of Phoenix, and Phoenix College that will eventually extend the connection canopy over 13 area districts in the next few years.


The lies of free sign-ups – The Kathmandu Post

When I say browsing through any website isn't free, I don't mean the price you pay to your internet service provider or the price you pay for your electricity bill. It's not even the price you pay to get hold of the electronic devices you use to access such accounts; instead, it's your data, your private information, which is sold, auctioned off to the highest bidder. Over and over again. To quote an internet user named bluebeetle: "If you are not paying for it, you're not the customer; you're the product being sold."

In the infamous 2018 hearing of Facebook's chief executive Mark Zuckerberg, Senator Hatch asked him, "How do you sustain a business model in which users don't pay for your service?" Zuckerberg replied, "Senator, we run ads." So, for platforms that rely heavily on their users' watch time and click-through rate on ads to earn money, it is only logical for their business model to focus on what their advertisers want and how to market relevant ads to the users. Hence they use targeted ads. Targeted advertisements serve users ads that revolve around their specific interests, traits and shopping patterns.

The websites run ads specific to their users to benefit their "customers", the companies that buy the advertisement slots. A 2013 study in the Netherlands showed that when a law was introduced requiring websites to inform visitors of tracking in advertisements, click-through rates dropped. So it is obvious why companies would use sneaky ways and abuse loopholes in the law to mine data.

With users sharing their personal data and web cookies tracking their every click, marketers have been able to tailor ads to each user according to their needs. Research shows that many people don't know that their data dictates the ads they receive.

Most of us have received ads for products or services precisely when we needed them. Researchers have found that users perceive personalised ad content as more appealing and more connected to their interests.

Have you ever wondered how much information Facebook, YouTube or Instagram has on you? How much information has Google stored on you, and to what extent does it keep track of your search history and click-through rates? Well, you can request a copy of the data these websites have on you. When I got curious about how many tabs Google keeps on my personal information, I exported my personal data from Google. It created a copy spanning 46 products that contained 39.25 GB worth of data. Your privacy settings determine how much of your browsing history and activity on related products you allow Google to access.

Google keeps track of and stores your location. Google has a record of every place you've been to (if your location tracking was turned on). I was shocked to find Google still keeping records of a random restaurant I visited on July 31st, 2015. It stores your search history across all of your devices, even the ones you have deleted. It knows all the apps you've used, every extension used. It has all of your YouTube history: likes, comments, searches and subscriptions. So, based on the content you watch on YouTube, Google can figure out your personality, political inclination, religious stance, health data, and tastes and preferences on basically anything. Google Photos has access to all the photos you've taken through your phone. Much like Google, Facebook, too, keeps track of every message you have sent and people you've befriended or unfriended. It also keeps track of your log-ins and log-outs, the devices you have used, and the places you have visited. Even if you delete any piece of information, it just becomes invisible but never really disappears.

Inspired by Brian X. Chen's article in the New York Times, I downloaded the information that Facebook has on me. To my horror, I found out that they had 3.46 GB worth of data on me. I found a folder labelled "Ads_information". A section named "Advertisers who uploaded a contact list with your information" contained an overwhelming majority of companies I had never heard of or interacted with. It also had an "Advertisers you've interacted with" folder that records every advertisement I've interacted with.
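
If you download your own archive, a short script can make the scale of that folder concrete. The sketch below is purely illustrative: it assumes a hypothetical export layout (an ads_information directory of JSON files containing "name"-style advertiser entries); the actual file names and structure vary between accounts and download dates.

```python
import json
from pathlib import Path

# Hypothetical path to an unzipped Facebook data export; adjust to your own download.
EXPORT_DIR = Path("facebook-export/ads_information")

def collect_names(node, found):
    # Walk arbitrarily nested JSON and collect anything stored under a "name"-like key.
    if isinstance(node, dict):
        for key, value in node.items():
            if key in ("name", "advertiser_name") and isinstance(value, str):
                found.add(value)
            else:
                collect_names(value, found)
    elif isinstance(node, list):
        for item in node:
            collect_names(item, found)

advertisers = set()
for json_file in EXPORT_DIR.glob("*.json"):
    with json_file.open(encoding="utf-8") as fh:
        collect_names(json.load(fh), advertisers)

print(f"{len(advertisers)} distinct advertisers referenced in {EXPORT_DIR}")
```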

Chen further explains how brands obtain users' information. They buy information from data providers like Acxiom and take that information to Facebook to serve targeted ads, and they use tracking technologies like web cookies and invisible pixels to collect information about your browsing activities. According to Ghostery, Facebook offers different trackers to help brands harvest your information; advertisers can take some of the data they have collected with those trackers and upload it into the "Custom Audiences" tool to display ads to you on Facebook. After receiving a backlash, Facebook has limited the practice of allowing advertisers to target ads using information from third-party data brokers.

Sometimes ignorance is bliss, but not when it's your data that is in danger. The free services these companies provide us don't automatically mean we're getting fair compensation in exchange for our data. As MIT Technology Review has put it, users "have little idea how much personal data they have provided, how it is used, and what it is worth." If the general public were aware of viable alternatives, they might hold out for compensation instead of handing their data over for free.

In the same internet space where browsers like Gener8 Ads respect your choice to either limit your data collection or generate money from it, we are obliged to ask whether Google has been selling our data in exchange for providing "free" services. In the Harvard Business Review, Maurice E. Stucke wrote how data-opolies have been depressing privacy protection below competitive levels and collecting personal data above competitive levels. (The data-opolies are Google, Facebook, Amazon and similar companies that face minimal competition.) Stucke compares the collection of excessive personal data with charging an exorbitant price for a product or service. Since these companies face limited, or no, competition, users have no competitive alternatives, and their bargaining power is therefore nonexistent.

These companies are also at considerable risk of a security breach, as hackers have greater incentives to target them. The personal data of over 533 million Facebook users was leaked online in April 2021, exposing users' data for free and leaving them vulnerable to data theft and impersonation.

While there should be stronger regulations on internet security and data privacy, data awareness amongst users is also necessary. So maybe next time you sign up for a website, first at least skim-read the "Terms of Service"; and the next time you click on "Accept Cookies", first read what data you're willing to let the website track and collect from your browser.


Why Kubernetes isn't just another tech buzzword – IT Brief New Zealand

Article by New Relic ANZ solutions consulting senior director Myk Shaforostov and Innablr CTO Prateek Nayak.

Kubernetes, or K8s, the popular container orchestration platform, has profoundly transformed the way development teams deploy software, and for good reason.

It's rapidly becoming the source of truth for many organisations due to its centralised platform structure, and it has numerous benefits, including increased deployment agility, cost savings and scalability.

In fact, new research has found that 68% of IT professionals increased their K8s usage due to the COVID-19 pandemic. The benefits of K8s are being realised by everyone from tier-one enterprises to smaller-scale business operators and everything in between. So why should businesses get on the K8s bandwagon?

K8s is an open-source container orchestration platform designed to automate the deployment, scaling and management of containerised applications. K8s comes with many benefits, including effective resource consumption control, easy canary deployments and rollbacks, and easy scalability.

Initially developed by Google, K8s today is the de facto standard for container orchestration and the flagship project of the Cloud Native Computing Foundation (CNCF).

While K8s itself sits in the developer and engineering realm, it has notable flow-on effects on the wider business, especially in terms of operational efficiencies and logistical concerns, both of which impact the organisation's bottom line.

Distributed teams and security best practices

One of the biggest concerns companies tend to have around K8s, and indeed open-source technology in general, is security. Traditionally, when businesses run on-premises, they are isolated behind the network, so security isn't so much of an issue. However, for companies that operate in the cloud with distributed teams, that boundary doesn't exist. When teams are in disparate locations deploying their own VMs, tracking the security posture is challenging.

One Melbourne-based K8s Certified Service Provider (CNCF), with a focus on cloud engineering and next-generation platform consultancy, explained that the open-source community has built an ecosystem of robust software to bolster the K8s security landscape. And because K8s promotes the centralising of infrastructure, security oversight is simplified, and security mechanisms can be bolted on with ease.

With K8s, there are many controls, such as the Center for Internet Security (CIS) and National Institute of Standards and Technology (NIST) guidelines, that outline how to secure K8s infrastructure, plus tools that provide the necessary automation to benchmark clusters.

Reports alerting users to potential security risks can be easily run, which is powerful for security personnel because it gives them one place to focus on for all the teams in the business. Furthermore, when security modifications or enhancements are made, everybody in the team benefits.
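
As a rough illustration of how easily such a cluster-wide report can be scripted, here is a minimal sketch using the official Kubernetes Python client. It flags pods running privileged containers, one of the checks covered by the CIS benchmark. It assumes a working kubeconfig on the machine running it and is not any vendor's tooling, just an example of the kind of report described above.

```python
from kubernetes import client, config

# Assumes a kubeconfig is available locally (e.g. ~/.kube/config).
config.load_kube_config()
v1 = client.CoreV1Api()

findings = []
for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    for container in pod.spec.containers:
        ctx = container.security_context
        if ctx is not None and ctx.privileged:
            findings.append((pod.metadata.namespace, pod.metadata.name, container.name))

# One report, one place to look, for every team deploying onto the cluster.
for namespace, pod_name, container_name in findings:
    print(f"Privileged container: {namespace}/{pod_name} ({container_name})")
print(f"{len(findings)} privileged containers found")
```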

This can be characterised as localised improvements that allow for global benefits. By making one local improvement to a K8s cluster, every team that is deploying onto that cluster benefits from it and inherits the improvements instead of having to implement them themselves.

Localised improvements that allow for global benefits

The benefits of K8s extend beyond security. Business engineers no longer need to spend large amounts of time implementing security controls and going through weeks of checks and balances. Engineers are free to focus purely on writing features or enhancements for customers and react faster to market changes.

The pandemic is a prime example. Businesses that were already well into their K8s journey could react much faster to the switch to digital than those working in on-prem environments. The agility benefits are an inherited part of a central platform like K8s.

Solutions for businesses big and small

It's not just smaller, more nimble businesses that are reaping the benefits of K8s. Some of the biggest names in Australia's banking and insurance industries have been on their K8s adoption journey for well over two to three years. What these businesses find most compelling is the centralised nature, flexibility, and scale that the platform offers. It also works with all the cloud providers and with on-prem systems, and it offers developers a straightforward interface to deploy applications.

Even given the platform's success to date, K8s is still early in its journey towards achieving its full potential, which makes it all the more exciting.

Kubernetes is here to stay.


Protect Your Privacy For Life With This VPN Loved By Its Users, Now Less Than $40 – IGN Nordics

If you use the internet, you need a solid VPN. It keeps your private data away from prying eyes, makes even public Wi-Fi safe to use, and allows you to browse the internet totally unrestricted by geo-blocks. Considering that 93% of data breaches could have been avoided through basic data security measures, if you haven't been keeping your browsing information, online banking and personal data secure through a VPN yet, now is the time to do something about it.

That's because today, we've found an awesome deal on one of the absolute best VPN services on the internet. Right now, you can grab a lifetime subscription to BelkaVPN on sale for just $39.99 - that's a massive 94% discount off the regular price of $719.

Belka protects all your private data thanks to its encryption shield tunnel, which secures your activity even when using public networks. It also comes with a zero-log policy and no speed or bandwidth restrictions, so your connection will always be fast and secure.

Belka uses more than 120 VPN servers across 25 global locations, keeping your information private and letting you bypass online region restrictions, too. Excitingly for entertainment fans, that opens up a whole new world of international TV, movie and music streaming, allowing you to access the servers for US and international Netflix, BBC iPlayer, Hulu, ESPN+ and HBO, plus over 40 other streaming services. So, whether you want to always be able to watch US shows wherever you travel, or you want to enjoy international shows you can't see on regular US channels, you'll never be blocked from watching anything by location ever again.

This VPN comes extremely highly rated, too, with 4.1/5 stars on the Google Play Store and a 4.2/5-star rating on Trustpilot. It's no surprise the service is loved by its users. As one recently wrote, "I have been looking for a good and secure VPN and this VPN has everything."

Ready to secure your private data for life? Get your BelkaVPN Lifetime Subscription on sale with 94% off right now, for $39.99 (reg. $719).


EXCLUSIVE Microsoft warns thousands of cloud customers of exposed databases – Reuters

SAN FRANCISCO, Aug 26 (Reuters) - Microsoft (MSFT.O) on Thursday warned thousands of its cloud computing customers, including some of the world's largest companies, that intruders could have the ability to read, change or even delete their main databases, according to a copy of the email and a cyber security researcher.

The vulnerability is in Microsoft Azure's flagship Cosmos DB database. A research team at security company Wiz discovered it was able to access keys that control access to databases held by thousands of companies. Wiz Chief Technology Officer Ami Luttwak is a former chief technology officer at Microsoft's Cloud Security Group.

Because Microsoft cannot change those keys by itself, it emailed the customers Thursday telling them to create new ones. Microsoft agreed to pay Wiz $40,000 for finding the flaw and reporting it, according to an email it sent to Wiz.

"We fixed this issue immediately to keep our customers safe and protected. We thank the security researchers for working under coordinated vulnerability disclosure," Microsoft told Reuters.

Microsoft's email to customers said there was no evidence the flaw had been exploited. "We have no indication that external entities outside the researcher (Wiz) had access to the primary read-write key," the email said.

"This is the worst cloud vulnerability you can imagine. It is a long-lasting secret," Luttwak told Reuters. "This is the central database of Azure, and we were able to get access to any customer database that we wanted."

Luttwak's team found the problem, dubbed ChaosDB, on Aug. 9 and notified Microsoft Aug. 12, Luttwak said.


The flaw was in a visualization tool called Jupyter Notebook, which has been available for years but was enabled by default in Cosmos beginning in February. After Reuters reported on the flaw, Wiz detailed the issue in a blog post.

Luttwak said even customers who have not been notified by Microsoft could have had their keys swiped by attackers, giving them access until those keys are changed. Microsoft only told customers whose keys were visible this month, when Wiz was working on the issue.

Microsoft told Reuters that "customers who may have been impacted received a notification from us," without elaborating.

The disclosure comes after months of bad security news for Microsoft. The company was breached by the same suspected Russian government hackers that infiltrated SolarWinds, who stole Microsoft source code. Then a wide number of hackers broke into Exchange email servers while a patch was being developed.

A recent fix for a printer flaw that allowed computer takeovers had to be redone repeatedly. Another Exchange flaw last week prompted an urgent U.S. government warning that customers need to install patches issued months ago because ransomware gangs are now exploiting it.

Problems with Azure are especially troubling, because Microsoft and outside security experts have been pushing companies to abandon most of their own infrastructure and rely on the cloud for more security.

But though cloud attacks are more rare, they can be more devastating when they occur. What's more, some are never publicized.

A federally contracted research lab tracks all known security flaws in software and rates them by severity. But there is no equivalent system for holes in cloud architecture, so many critical vulnerabilities remain undisclosed to users, Luttwak said.

Reporting by Joseph Menn; Editing by William Mallard



Monday: Hardware & consumption boom, Bitcoin theft, cloud & T-Mobile gaps – Market Research Telecast

Processors and graphics cards continue to sell well, but in the context of the energy transition and the call for more sustainability, electric cars, new house insulation and organic shoes are also in demand. But does everything have to be bought new? It is also environmentally friendly to keep using your existing belongings instead of replacing them. Here is a brief overview of the most important news.

Although the second quarter is traditionally rather weak, the three big chip manufacturers, Intel, Nvidia and AMD, further increased their sales figures. Despite the chip shortage, sales of CPUs and graphics cards continue to rise. Intel held on to its position as market leader, but Nvidia increased its market share in graphics cards a little.

This consumer culture is also evident in other areas. A survey on personal contributions to protecting humanity's future habitat shows what the respondents bought: new electric cars, new house insulation, new organic shoes, new e-bikes, new bamboo straws, new zinc watering cans. What is missing is the downside, the garbage behind it all. The Missing Link column looks at overconsumption and false consumption promises: don't buy an electric car!

Not bought, but allegedly stolen with malware: two years ago, British youngsters made off with bitcoins. With a civil action, an American now wants to recover those 16 bitcoins. At the time of the theft, the two alleged perpetrators were still minors and lived with their parents, so after the loss of the 16 bitcoins the victim is also suing the parents of the alleged thieves.

Microsoft's cloud service Azure was not infested with malware, but it apparently contained a security breach through which unauthorized persons could gain full access to customers' cloud databases. Microsoft says it has since closed the gap, but affected customers should take action themselves to prevent unauthorized access. After the cloud database disaster, Microsoft has therefore informed its Azure customers about the serious gap.

In contrast to the Azure vulnerability, which so far has had no known consequences, the most recent break-in at the servers of T-Mobile US saw data on more than 50 million customers stolen. The system made it easy for the hacker, as he himself explains in a letter to the press: cracking the defense mechanisms of the US telecom subsidiary cost him little effort. The hacker used a devastating security hole for the data breach at T-Mobile US.

A devastating development is also becoming apparent in the coronavirus pandemic, because the number of Covid-19 patients treated in intensive care units nationwide has risen above 1,000 for the first time in the fourth corona wave. In the DIVI register's daily report on Sunday, 1,008 Covid-19 patients were reported in intensive care, 485 of whom had to be ventilated. The low was 354 on July 22nd. Since then, occupancy has increased again, pushing the number of Covid-19 patients in intensive care units back above 1,000.

Also important:

(fds)



Rethinking Your Tool Chain When Moving Workloads to the Cloud – Virtual-Strategy Magazine

Software-driven IT organizations generally rely on a tool chain made up of commercial and home-grown solutions to develop, deploy, maintain and manage the applications and OSes that their business depends on. Most IT shops have preferred tools for needs like application monitoring, data protection, release management or provisioning and deprovisioning resources. But are those tools always the best options?

While tool chains do evolve over time, it's rare for IT organizations to conduct a full, top-to-bottom review of the tools they are comfortable using with an eye toward optimization or new capabilities. One motivator is when companies are considering moving workloads from the data center to the cloud. The inherent differences in how applications are developed and managed for on-premise vs. cloud environments are a strong reason to reassess whether the current tools in your arsenal are the best alternatives available or, just as important, whether they're well suited to a more cloud-centric software lifecycle.

When it comes to reevaluating your tool chain, it helps to have a process. Here's one approach:

It's important to start with a full audit of your current stack, including areas such as:

Obviously, it's important to assess how well each product meets your current needs as they stand today. (Are there capabilities you wish it had, or weaknesses you've become accustomed to working around?) Then consider how those needs will change as workloads move to the cloud. A good first question to ask is whether the tool is still supported by the vendor. Given how infrequently IT teams switch tools, there's a not insignificant likelihood that one or more of your tools has become an orphan. Second, does the license agreement for the tool accommodate or restrict its use in the cloud? For instance, some tools are licensed to a specific physical server, and some vendors require their hardware to be owned by the same entity that holds the license. Both of these scenarios are problematic for cloud-based deployments. Third, does moving to a cloud-based tool open up new possibilities that you want to take advantage of? Removing the constraints of on-premise solutions and gaining capabilities like nearly unlimited compute and storage, dynamic workloads and multiple regions around the world can provide much-needed flexibility. But the advantage of moving to a cloud-based tool (replacing, say, an on-premise application log reporting solution with Azure Log Manager) needs to be balanced against the added management the new solution requires, as well as the need to retrain teams.
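
One lightweight way to keep this audit honest is to record the answers in a structured form rather than as prose in a spreadsheet. The sketch below is purely illustrative; the fields and the example entries are assumptions, not a prescribed schema or any specific vendor's tooling.

```python
from dataclasses import dataclass

@dataclass
class ToolRecord:
    name: str
    category: str              # e.g. "monitoring", "data protection", "provisioning"
    vendor_supported: bool     # is the tool still supported by the vendor?
    license_allows_cloud: bool # does the license permit running it in the cloud?
    cloud_alternative: str = ""
    notes: str = ""

# Hypothetical example entries for the audit.
inventory = [
    ToolRecord("LegacyLogReporter", "log reporting", vendor_supported=False,
               license_allows_cloud=False, cloud_alternative="cloud log analytics",
               notes="licensed to a specific physical server"),
    ToolRecord("BackupSuite", "data protection", vendor_supported=True,
               license_allows_cloud=True),
]

# Flag anything that blocks or complicates a cloud move.
for tool in inventory:
    if not (tool.vendor_supported and tool.license_allows_cloud):
        print(f"Review before migration: {tool.name} ({tool.category}) - {tool.notes}")
```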

There are also non-technical factors to consider when looking at new tools. Do teams enjoy using the tool? Does it make them more productive (or, conversely, slow them down)? Does it meet the business needs of the organization? How much work will adopting a new tool take, and will it be worth it in the long run? While these may not be the most important considerations, they shouldn't be overlooked.

There is almost always going to be an alternative to any individual tool and, potentially, one tool that can do the work of several, making it possible to consolidate. One way to get a sense of what's available is to start by asking other teams in your organization what they use in the destination cloud. There are often cloud-based tools (offered by cloud vendors or sold as separate SaaS products) that offer pay-as-you-go licensing, can be easier to scale up or down, make it simpler to move workloads around, and can expand to other regions. Today, some legacy vendors even offer consumption-based options to better match up against cloud-based competitors, while others stick with more traditional perpetual licenses. Last, consider whether a new tool will give IT teams the opportunity and motivation to expand their skill set. Offering the chance to learn and use new products could actually increase job satisfaction and improve your organization's ability to retain engineering talent.

Before you pull the trigger on a new solution, it often pays to check in with the existing vendor. To keep your business, they may offer more generous or flexible terms. Of course, vendors that see the cloud as a threat are probably going to be less inclined to give you a break on licensing. But even if the conversation doesn't lead to new or better terms, talking to your vendors on a regular basis can provide insights into how they see their customers and the market.

Once you've completed the previous steps, you'll have a good idea of the tools you're likely to keep and those you'd like to upgrade. At that point, it's important to create a plan for adopting each new tool. Start by separating products that need to be replaced soon from those where more research is required; it also helps to compile any other useful information learned during the process so that the larger IT team can access it. You'll want to assess whether teams will need training, whether internal documentation or playbooks need to be updated, and how new tools will plug into existing authorization/authentication solutions. Finally, you will also need a migration plan for each tool that details how and when the organization will move from the old product to the new one, which scripts will need to be rewritten, and what to do with historical data like log files from the old product.

While cloud-based tools offer meaningful benefits in terms of flexibility, cost savings and ease of scalability, they may not be the best solution for every organization. The only way to be sure is to do the kind of analysis outlined above. For companies that have already made the decision to move workloads to the cloud, the potential long-term benefits of adopting new solutions are worth the effort.

Skytap


GraphQL’s Emerging Role in Modernization of Monolithic Applications – IT Jungle

August 30, 2021 - Alex Woodie

Every now and then, a technology emerges that lifts up everything around it. GraphQL has the potential to be a technology like that, and that's good news for customers running older, established applications, such as those that run on IBM iron.

IBM's mainframe and its midrange IBM i server are often maligned as old, washed-up, legacy platforms, but people who say that are missing a key distinction: it's usually not the server that's old. In most cases, it's the application that was first deployed in the Clinton Administration (or earlier) that is the problem.

Companies typically have many reasons why they haven't modernized applications or migrated to something newer. For starters, the applications just work. And, despite the changing technological winds around us, the fact that these applications continue to do what they were originally designed to do (process transactions reliably and securely, day in and day out, often for years on end, without much maintenance) is not a trivial thing, nor is it something to fool around with. "If it ain't broke, don't fix it" probably best encapsulates this attitude.

You were supposed to abandon these monolithic applications in favor of client-server architectures 30 years ago. In the late 1990s, you were supposed to migrate your RPG and COBOL code to Java, per IBM's request. The explosion of the World Wide Web in the early 2000s provided another logical architecture to build to, followed by the growth of smart devices after the iPhone's appearance in 2007. Today, everybody aspires to run their applications as containerized microservices in the cloud, which surely is the pinnacle of digital existence.

After all these boundary-shaking technological inflection points, it's a wonder that mainframes and IBM i servers even exist at this point. But of course, despite all the best-laid plans to accelerate their demise, they do (as you, dear IT Jungle reader, know all too well).

So what does all this have to do with GraphQL? We first wrote about the technology in March 2020, just before the COVID pandemic hit (so perhaps you missed it).

GraphQL, in short, is a query language and runtime that was originally created at Facebook in 2012 and open sourced in 2015. Facebook developers, tired of maintaining all of the repetitive and brittle REST code needed to pull data out of backend servers to feed to mobile clients, desired an abstraction layer that could insulate them from REST and accelerate development. The result was GraphQL, which continues to serve data to Facebooks mobile clients to this day.

Since it was open sourced, GraphQL adoption has grown exponentially, if downloads of the technology mean anything. Geoff Schmidt, the chief executive officer and co-founder of GraphQL backer Apollo, says 30 percent of the Fortune 500 have adopted Apollo tools to manage their growing GraphQL estates.

Following Apollo's recent Series D funding round, which netted the San Francisco company $130 million at a $1.5 billion valuation, Schmidt is quite excited about the potential for GraphQL to alleviate technical burdens for enterprises with lots of systems to integrate, including monolithic applications running on mainframes.

"Frankly, there are great use cases around mainframe systems or COBOL systems," Schmidt says. "You just slide this graph layer in between the mainframe and the mobile app, and you don't have to change anything. You just get that layer in there, start moving the traffic over to GraphQL and route it all through the graph."

Once the GraphQL layer is inserted between the backend systems and the front-end interfaces, front-end developers have much more freedom to develop compelling app experiences without going to the backend developer to tweak a REST API. In addition to accelerating developers' productivity, it also insulates the backend system from changes.

"Once you put that abstraction layer in place, not only can you combine all that stuff and get it to every platform in a very agile, fast manner," Schmidt tells IT Jungle, "but at the same time, if you want to refactor that, if you have a monolithic backend that you want to move into microservices, or you just want to change how the backend is architected, you can now do that without having to disturb all those clients that exist in the field."
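
To make the "graph layer over an existing backend" idea concrete, here is a minimal sketch in Python using the graphene library: a single GraphQL type whose resolver simply calls an existing REST endpoint, so clients query the graph while the backend API stays untouched. The endpoint URL and field names are hypothetical placeholders, not anything from a real system described in this article.

```python
import graphene
import requests

# Hypothetical REST endpoint exposed by an existing backend system.
ORDERS_API = "https://example.internal/api/orders"

class Order(graphene.ObjectType):
    id = graphene.ID()
    status = graphene.String()
    total = graphene.Float()

class Query(graphene.ObjectType):
    order = graphene.Field(Order, id=graphene.ID(required=True))

    def resolve_order(root, info, id):
        # The resolver hides the REST call; clients only ever see the graph.
        payload = requests.get(f"{ORDERS_API}/{id}", timeout=5).json()
        return Order(id=payload["id"], status=payload["status"], total=payload["total"])

schema = graphene.Schema(query=Query)

# A front-end style query: ask only for the fields the screen needs.
result = schema.execute('{ order(id: "42") { status total } }')
print(result.data)
```

If the backend is later refactored, say the REST service is split into microservices, only the resolver changes; the query that clients send stays exactly the same, which is the point Schmidt is making.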

Microservices and Web services that utilize the REST approach are the de facto standard in the industry at the moment. But that could change. Schmidt cites a recent survey that found 86 percent of JavaScript developers ranked GraphQL as a top interest.

"This graph layer makes sense," Schmidt says. "It's at the edge of your data center. It's the abstraction layer that you want to put around all your backend services, whether it's to support front-end teams or to support partners. And partners are an even more powerful use case, because if they need a change to the API, hey, that can be six months or a year."

One of Apollo's customers is Walmart. The Arkansas-based retailer maintains systems for managing transactions in the store and on its ecommerce website. Using GraphQL, Walmart is able to deliver an incredible 360-degree customer experience, Schmidt says.

"Whether the customer wants to shop in store or they want to shop online, we're giving them the very best possible shopping experience," the CEO says. "The customer is going to describe how they want to shop, not the retailer, and that's what Walmart is able to deliver with a graph that brings together all the mainframes that power their brick-and-mortar stores with all of their cutting-edge ecommerce investment to serve the customer wherever they are."

Walmart, of course, has powered its share of innovation in IT. While some details of the implementation are not available, the fact that the retail giant is adopting GraphQL to address its data and application integration requirements may tell you something about the potential of this technology in an IBM environment, particularly considering the rallying cry from Rochester over the need to build next-gen IBM i apps.

The way Schmidt sees it, GraphQL lets customers think about their businesses "as platforms, as a bunch of capabilities that we can combine to meet customer needs, in any channel, anytime," he says.

"IT leaders who put a graph strategy in place now, maybe even before the business realizes the need for it, they're the ones who are going to have this platform strategy," he continues. "The IT leaders who put that in place are going to be heroes, because whatever the business asks for, they're going to be able to deliver a year faster than the competition."

So You Want To Do Containerized Microservices In the Cloud?

Public Cloud Dreams Becoming A Reality for IBM i Users

In Search Of Next Gen IBM i Apps

Modernization Trumps Migration for IBM i and Mainframe, IDC Says

COVID-19: A Great Time for Application Modernization

How GraphQL Can Improve IBM i APIs


Linux is not invulnerable, here are some top Linux malware in 2021 – Technology Zimbabwe

So yesterday I wrote, in my usual glowing terms, about the latest iteration of Ubuntu 20.04 LTS coming out. I feel like there was nothing amiss in that article; after all, Ubuntu, especially the version in question, is a stellar operating system that is rock solid and has served me well. A few people, however, decided to call me out on my bias and asked me to publicly admit that there is no such thing as an invulnerable operating system under the sun.

So here is me doing exactly that. I think I should repeat that for emphasis: there is no such thing as an invulnerable operating system under the sun. I often say the best way to make your computer impenetrable is to shut it down and pulverise it thoroughly with a hammer. But even then, who knows? I have seen FBI nerds in real movies pull information off a single surviving chip.

What makes Linux better than Windows, in my opinion, is not just the open-source code that is reviewed by scores of experts around the world. It's the philosophy behind it all. In Windows, ignorant users can click around and blunder their way to productivity. The system is meant to be easy and fits many use cases by default. All you need to do is boot up, enter your password or just stare at your computer to log in, get to the desktop, click on Chrome, and you are watching cat videos.

In Linux, things can be, but usually are not, that easy. While you can use Windows without knowing what a registry is, in Linux you have to be hands-on with your configurations. Every action you take has to be deliberate; otherwise you risk breaking things. Often you have to set up your desktop the way you want, Chrome is not installed by default, and sometimes you cannot even play videos until you install the right codecs. Linux forces you to learn and pay attention to what you are doing. You are often forced to learn why you are doing things in addition to how to do them.

Now that we have put the explanations out of the way, it's time to look at some of the top Linux malware in 2021. One thing to note is that cloud-centric malware dominates in Linux. There are probably a couple of reasons for this, including:

Below are the top malware in Linux according to Trend Micro

One thing to note from the above is that, unlike in Windows, Linux malware is often heavily customised by attackers to target a specific vulnerability, and each Linux system is often unique. This means that it's rare to see one specific piece of malware dominate; instead you have families of related malware.

Again, I am biased, but I believe identifying and thwarting an attack in Linux is pretty easy. You have tools like UFW (or, better yet, iptables) to lock down your internet connection in ways that are unimaginable in Windows. For example, whenever I set up a new cloud server, I simply block all non-Zimbabwean IPs by default. That alone removes 99.99% of the threats from the table.
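
As a rough illustration of that default-deny approach (a sketch, not my actual server setup), the snippet below shells out to ufw from Python: deny all incoming traffic by default, then allow SSH only from address blocks listed in a country CIDR file. The file name and its contents are hypothetical, and the commands need root privileges and UFW installed to succeed.

```python
import subprocess
from pathlib import Path

# Hypothetical file with one CIDR block per line (e.g. ranges allocated to local ISPs).
CIDR_FILE = Path("zw_cidr_blocks.txt")

def ufw(*args: str) -> None:
    # Runs a single ufw command; requires root and ufw installed.
    subprocess.run(["ufw", *args], check=True)

ufw("default", "deny", "incoming")   # drop everything by default
ufw("default", "allow", "outgoing")

for line in CIDR_FILE.read_text().splitlines():
    cidr = line.strip()
    if cidr and not cidr.startswith("#"):
        # Allow SSH only from the approved address blocks.
        ufw("allow", "from", cidr, "to", "any", "port", "22", "proto", "tcp")

ufw("enable")
```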

Also, make it a habit to uninstall software you don't need. Better still, when installing, make sure you only install the base operating system with as little stuff as possible. You can then add just the things you need. Why install Apache on a Minecraft or mail server? Do you really need FTP? If not, stop and disable the service via SSH.

Above all, always check the logs. Always. Check resource usage too, and see whether it tallies with what you expect.


Here’s Why Nvidia Will Surpass Apple’s Valuation In 5 Years – Forbes


Nvidia has a market cap of roughly $550 billion compared to Apple's nearly $2.5 trillion. We believe Nvidia can surpass Apple by capitalizing on the artificial intelligence economy, which will add an estimated $15 trillion to GDP. This is compared to the mobile economy, which brought us the majority of the gains in Apple, Google and Facebook and contributes $4.4 trillion to GDP. For comparison purposes, AI contributed $2 trillion to GDP as of 2018.

While mobile was primarily consumer, with some enterprise via bring-your-own-device, artificial intelligence will touch every aspect of both industry and commerce, including consumer, enterprise, and small-to-medium sized businesses, and will do so by disrupting every vertical, similar to cloud. To be more specific, AI will resemble cloud in blazing a path defined by lowering costs and increasing productivity.

I have an impeccable record on Nvidia, including when I stated the sell-off in 2018 was overblown and missing the bigger picture, as Nvidia has two impenetrable moats: developer adoption and the GPU-powered cloud. This was when headlines were focused exclusively on Nvidia's gaming segment and GPU sales for crypto mining.

Although Nvidia's stock is doing very well this year, this has been a fairly contrarian stance in the past. Not only was Nvidia wearing the dunce hat in 2018, but in August of 2019 GPU data center revenue was flat to declining sequentially for three quarters, and in fiscal Q3 2020 (calendar Q4 2019) it also declined year over year. We established and defended our thesis on the data center as Nvidia clawed its way back in price through China tensions, supply shortages, threats of custom silicon from Big Tech, cyclical capex spending, and questions over whether the Arm acquisition will be approved.

Suffice to say, three years later Nvidia is no longer the contrarian stock it was during the crypto bust. Yet the long-term durability is still being debated - it's a semiconductor company after all - best to stick with software, right? Right? Not to mention, some institutions are still holding out for Intel. Imagine being the tech analyst at those funds (if they're still employed!).

Before we review what will drive Nvidia's revenue in the near term, it bears repeating the thesis we published in November of 2018:

"Nvidia is already the universal platform for development, but this won't become obvious until innovation in artificial intelligence matures. Developers are programming the future of artificial intelligence applications on Nvidia because GPUs are easier and more flexible than customized TPU chips from Google or FPGA chips used by Microsoft [from Xilinx]. Meanwhile, Intel's CPU chips will struggle to compete as artificial intelligence applications and machine learning inferencing move to the cloud. Intel is trying to catch up, but Nvidia continues to release more powerful GPUs, and cloud providers such as Amazon, Microsoft and Google cannot risk losing the competitive advantage that comes with Nvidia's technology.

"The Turing T4 GPU from Nvidia should start to show up in earnings soon, and the real-time ray-tracing RTX chips will keep gaming revenue strong when there is more adoption in 6-12 months. Nvidia is a company that has reported big earnings beats, with average upside potential of 33.35 percent to estimates in the last four quarters. Data center revenue stands at 24% and is rapidly growing. When artificial intelligence matures, you can expect data center revenue to be Nvidia's top revenue segment. Despite the corrections we've seen in the technology sector, and with Nvidia stock specifically, investors who remain patient will have a sizeable return in the future."

Notably, the stock is up 335% since my thesis was first published, a notable amount for a mega-cap stock and nearly 2-3X more return than any FAAMG stock in the same period. This is important because I expect this trend to continue until Nvidia has surpassed all FAAMG valuations.


Below, we discuss the Ampere architecture and A100 GPUs, the Enterprise AI Suite, and an update on the Arm acquisition. These are some of the near-term stepping stones that will help sustain Nvidia's price in the coming year. We are also bullish on the Metaverse with Nvidia specifically, but will leave that for a separate analysis in the coming month.

"Nvidia's acceleration may happen one or two years earlier as they are the core piece in the stack that is required for the computing power for the front-runners referenced in the graph above. There is a chance Nvidia reflects data center growth as soon as 2020-2021." - published August 2019, Premium I/O Fund

Last year, Nvidia released the Ampere architecture and the A100 GPU as an upgrade from the Volta architecture. The A100 GPUs are able to unify training and inference on a single chip, whereas in the past Nvidia's GPUs were mainly used for training. This gives Nvidia a competitive advantage by offering both training and inferencing. The result is a 20x performance boost, plus Multi-Instance GPU technology that lets a single A100 be partitioned into multiple GPU instances. The A100 offers the largest generational leap in performance across the past 8 generations.

At the onset, the A100 was deployed by the world's leading cloud service providers and system builders, including Alibaba Cloud, Amazon Web Services, Baidu Cloud, Dell Technologies, Google Cloud Platform, HPE and Microsoft Azure, among others. It is also being adopted by several supercomputing centers, including the National Energy Research Scientific Computing Center, the Jülich Supercomputing Centre in Germany and Argonne National Laboratory.

One year later, the Ampere architecture is becoming one of the best-selling GPU architectures in the company's history. This quarter, Microsoft Azure announced the availability of the Azure ND A100 v4 cloud GPU instance, which is powered by NVIDIA A100 Tensor Core GPUs and which the company claims is the fastest public cloud supercomputer. The news follows general-availability launches by Amazon Web Services and Google Cloud in prior quarters. The company has been extending its leadership in supercomputing: the latest Top500 list shows that Nvidia powers 342 of the world's top 500 supercomputers, including 70 percent of all new systems and eight of the top 10. This is a remarkable update from the company.

Demand for Ampere architecture-powered laptops has also been solid, as OEMs adopted Ampere architecture GPUs in a record number of designs. The lineup also features third-generation Max-Q power optimization technology, enabling ultrathin designs. The Ampere architecture product cycle for gaming has also been robust, driven by RTX real-time ray tracing.

In the area of GPU acceleration, Nvidia is working with Apache Spark to bring GPU acceleration to Spark 3.0 running on Databricks. Apache Spark is the industry's largest open source data analytics platform. The results are a 7x performance improvement and 90 percent cost savings in an initial test. Databricks and Google Cloud Dataproc are the first to offer Spark with GPU acceleration, which also opens up Nvidia for data analytics.

Demand for the company's products has been strong and has exceeded supply. On the earnings call, Jensen Huang noted, "And so I would expect that we will see a supply-constrained environment for the vast majority of next year is my guess at the moment." However, he assured listeners that the company has secured enough supply to meet its growth plans for the second half of this year, saying, "We expect to be able to achieve our Company's growth plans for next year."

Virtualization allows companies to use software to expand the capabilities of physical servers into virtual systems. VMware is popular with IT departments because the platform allows companies to run many virtual machines on one server, and networks can be virtualized so that applications function independently of hardware or share data between computers. The storage, network and compute offered through full-scale virtual machines and Kubernetes instances for cloud-hosted applications come with third-party support, making VMware an unbeatable solution for enterprises.

Therefore, it makes sense that Nvidia would choose VMware's vSphere as a partner for the Enterprise AI Suite, a cloud-native suite that plugs into VMware's existing footprint to help scale AI applications and workloads. As pointed out in a write-up by IDC, many IT organizations struggle to support AI workloads because they do not scale: deep learning training and AI inferencing are very data hungry and require more memory bandwidth than standard infrastructure is capable of delivering. CPUs are also not as efficient as GPUs, which offer parallel processing. Although developers and data scientists can leverage the public cloud for the more performance-demanding instances, there are latency issues with where the data repositories are stored (typically on-premise).

The result is that IT organizations and developers can deploy virtual machines with accelerated AI computing where previously this was only possible with bare-metal servers. This allows departments to scale and pay only for the workloads that are accelerated, with Nvidia capitalizing on licensing and support costs. Nvidia's AI Enterprise targets customers who are starting out with new enterprise applications, or deploying more enterprise applications, and require a GPU. As enterprise customers of the Enterprise AI Suite mature and require larger training workloads, it's likely they will upgrade to the GPU-powered cloud.

Subscription licenses start at $2,000 per CPU socket for one year, including standard business support five days a week. The software is also available with a perpetual license of $3,595, but support is extra. You also have the option to get 24x7 support for an additional charge. According to IDC, companies are on track to spend a combined total of nearly $342 billion on AI software, hardware, and services like AI Enterprise in 2021. So the market is huge, and Nvidia is expecting a significant business.
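
To put those list prices in rough perspective, the toy calculation below compares the quoted subscription and perpetual figures for a hypothetical 8-socket deployment over three years. It is illustrative only: it assumes the perpetual price is also per socket (the article does not say), and it ignores support contracts, discounts and any other costs.

```python
# Figures quoted above; the per-socket assumption for the perpetual license is ours.
SOCKETS = 8
YEARS = 3
ANNUAL_SUBSCRIPTION_PER_SOCKET = 2_000  # USD per CPU socket per year, support included
PERPETUAL_LICENSE_PER_SOCKET = 3_595    # USD per CPU socket, support extra

subscription_total = SOCKETS * YEARS * ANNUAL_SUBSCRIPTION_PER_SOCKET
perpetual_total = SOCKETS * PERPETUAL_LICENSE_PER_SOCKET

print(f"Subscription over {YEARS} years: ${subscription_total:,}")  # $48,000
print(f"Perpetual licenses (one-time):   ${perpetual_total:,}")     # $28,760
```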

Nvidia also announced Base Command, which is a development hub to move AI projects from prototype to production. Fleet Command is a managed edge AI software SaaS offering that allows companies to deploy AI applications from a central location with real-time processing at the edge. Companies like Everseen use these products to help retailers manage inventory and for supply chain automation.

Over the past year, there have been some quarters where data center revenue exceeded gaming, while in the most recent quarter, the two segments are inching closer with gaming revenue at $3.06 billion, up 85 percent year-over-year, and data center revenue at $2.37 billion, up 35 percent year-over-year.

It was good timing for Jensen Huang to appear in a fully rendered kitchen for the GTC keynote, as the professional visualization segment was up 156% year-over-year and 40% quarter-over-quarter. Not surprisingly, automotive was down 1% sequentially, although up 37% year-over-year.

Gross margins were 64.8% when compared to 58.8% for the same period last year, which per management reflected the absence of certain Mellanox acquisition-related costs. Adjusted gross margins were 66.7%, up 70 basis points, and net income increased 282% YoY to $2.4 billion or $0.94 per share compared to $0.25 for the same period last year.

Adjusted net income increased by 92% YoY to $2.6 billion or $1.04 per share compared to $0.55 for the same period last year.

The company had record cash flow from operations of $2.7 billion and ended the quarter with cash and marketable securities of $19.7 billion and $12 billion in debt. It returned $100 million to shareholders in the form of dividends. It also completed its previously announced four-for-one split of its common stock.

The company is guiding for third-quarter fiscal revenue of $6.8 billion with adjusted margins of 67%. This represents growth of 44%, with the lion's share of sequential growth driven by the data center.

We've covered the Arm acquisition extensively in a full-length analysis, which you can find here, on why the Nvidia-Arm acquisition should be approved. In that analysis, we point out why we are positive on the deal: despite Arm's extremely valuable IP, the company makes very little revenue for powering 90% of the world's mobile processors and smartphones (therefore, it needs to be a strategic target). We also argue that the idea of Arm being neutral in a competitive industry is idealistic, and that blocking innovation at its most crucial point would be counterproductive for the governments reviewing the deal. We also discuss how the Arm acquisition will help facilitate Nvidia's move towards edge devices.

In the recent earnings call, CFO Colette Kress reiterated that the Arm deal is a positive for both companies and their customers, as Nvidia can help expand Arm's IP into new markets like the data center and IoT. Specifically, the CFO stated, "We are confident in the deal and that regulators should recognize the benefits of the acquisition to Arm, its licensees, and the industry."

The conclusion to my analysis is the same as the introduction: I believe Nvidia is capable of outperforming all five FAAMG stocks and will surpass even Apple's valuation in the next five years.

As stated in the article, Beth Kindig and the I/O Fund currently own shares of NVDA. This is not financial advice. Please consult your financial advisor regarding any stocks you buy.

Please note: The I/O Fund conducts research and draws conclusions for the Fund's positions. We then share that information with our readers. This is not a guarantee of a stock's performance. Please consult your personal financial advisor before buying any stock in the companies mentioned in this analysis.

Follow me on Twitter. Check out my website or some of my other work here.
