Category Archives: Cloud Servers

Experts discuss industry response to multicloud – IT Brief New Zealand

Article by NetEvents editor Lionel Snell

When everyone first started talking about the cloud, it looked as if the pendulum might be swinging back towards a client/server situation, with the cloud being the server to a worldwide population of relatively thin clients.

Big cloud providers encouraged that model: give us your data and we will sell you access to our services.

But that one cloud has evolved from a single entity into a broad concept, one that includes private as well as public clouds, and now the inevitable hybrid clouds incorporating both.

Now we have multicloud: is this just a new name for a hybrid cloud?

As I understood it, the difference should be that a hybrid cloud means an interconnected combination of public and private clouds, so that they become one integrated whole, whereas a multicloud means relying on several cloud services from several vendors for business purposes.

But the two are not distinct: for example, Dell EMC cloud solutions offer to transform IT by leveraging a multicloud approach spanning a variety of public, private and hybrid resources.

And IBM's multicloud solutions page says: "Multicloud is a cloud adoption strategy that embraces a mix of cloud models (public, dedicated, private, managed) to best meet unique business, application and workload requirements."

Wikibon chief research officer and general manager Peter Burris says: "The fundamental business objective is to use data as an asset... digital business is about how we are going to put data to work differently."

In particular, data is being used to further the current trend for transforming products into services: and that is just what cloud is already doing in the IT industry. This is an important point, because it means that the way the cloud is developing now could be a pattern for the way future businesses will develop.

Instead of repeating the usual cliché about data being the new oil, he pointed out what a lousy analogy that was: "Data is easily copied. It's easily shared. It's easily corrupted. It does not follow the laws of scarcity and that has enormous implications, certainly for all the vendors on the panel and virtually every enterprise on the planet."

Seeing cloud development as a roadmap for broader, longer-term tech-industry trends does make this a vital topic, and it emphasises the point that the cloud is not about centralising computing on a massive scale, but about creating simpler, more powerful distributed computing.

Rather than pass our data up into some provider's cloud, we would rather keep the data in place: where it is gathered, where it is most secure, where intellectual property is easiest to protect, and where the actual business takes place.

This is not about moving data into the cloud. This is about moving the cloud and cloud services to the data. Within 10 years the cloud is going to reflect a natural organisation of data, whether it's at the edge, whether it's in the core or whether it's in public cloud attributes.

Cisco cloud platforms and solutions group product management VP Jean-Luc Valente points out that it is one thing to upload a terabyte of data to a cloud, but as the surge in data and applications rises towards exabytes, he says it would cost $30 million to upload just one of those to a public cloud.
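Valente's figure is easy to sanity-check with back-of-envelope arithmetic, assuming a flat blended cost of $0.03 per gigabyte (an illustrative number; real cloud transfer and storage pricing is tiered and provider-specific):

```python
# Back-of-envelope check of the "$30 million per exabyte" figure.
# Assumes a flat $0.03/GB cost; actual cloud pricing is tiered and
# varies by provider, so this is illustrative only.

GB_PER_EXABYTE = 10**9          # 1 EB = 1e9 GB (decimal units)
COST_PER_GB = 0.03              # assumed blended $/GB

cost = GB_PER_EXABYTE * COST_PER_GB
print(f"${cost:,.0f}")          # $30,000,000
```

At that assumed rate, the $30 million figure falls out directly: an exabyte is a billion gigabytes.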

This explosion of data at the edge is very serious from a networking and security angle.

Over the decades, networking has evolved from being a means to connect devices, to connecting sites, and then connecting pages and individuals on social media. So is it moving towards primarily connecting data and services?

According to NetFoundry CEO Galeal Zino: "Now that the application is the new edge, and data is everywhere, we actually need to reinvent networking and the ecosystems around networking to match that new reality."

NetScout strategic alliances area VP Michael Segal referenced recent discussions about AI and machine learning using data to train automatic processes.

A lot of this would require analysing data in real-time, so edge computing becomes very important. The data needs to be close to where it's being analysed and where it provides insight in real-time.

Burris emphasises the increasingly critical role of the network: the actual training algorithms in use date back well before 2000; it was just that until recently there wasn't the parallel computing capability to put them to work effectively.

Apstra CEO and founder Mansour Karam is another who sees this as an exciting time to be in networking.

He says: "Managing networks like before no longer works. You can't manage networks manually by configuring devices by hand. It has to be done through software. It has to be done through powerful automation. You have to have the ability to abstract out all of those network services across all of those domains and you have to have the ability to operate these networks, enforce those policies, set these configurations and verify them remotely in every location where data resides."
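Karam's point about declaring intent in software rather than configuring devices by hand can be illustrated with a minimal sketch: declare the desired state, diff it against what each device reports, and push only the changes. The device names and settings below are hypothetical, and a real intent-based system would talk to vendor APIs rather than plain dicts:

```python
# Minimal sketch of intent-based automation: desired state is declared
# once, and per-device drift from that intent is computed mechanically.

desired = {
    "leaf1": {"vlan": 100, "mtu": 9000},
    "leaf2": {"vlan": 100, "mtu": 9000},
}

actual = {
    "leaf1": {"vlan": 100, "mtu": 1500},   # drifted from intent
    "leaf2": {"vlan": 100, "mtu": 9000},
}

def plan_changes(desired, actual):
    """Return, per device, only the settings that differ from intent."""
    changes = {}
    for device, intent in desired.items():
        drift = {k: v for k, v in intent.items()
                 if actual.get(device, {}).get(k) != v}
        if drift:
            changes[device] = drift
    return changes

print(plan_changes(desired, actual))   # {'leaf1': {'mtu': 9000}}
```

The same diff-then-apply loop is what lets a controller verify configurations remotely: re-running the plan after a push should yield an empty change set.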

So the importance of the multicloud is not where the data lies, but how it is managed in an agile manner by leveraging service mesh technology and applying containers, DevOps or DevSecOps. Once we can manage the edge with that same level of agility and automation, the data and the applications will exist wherever they are best put.

Segal compares this spread to the architecture of the modern data centre, where there are "a lot of server farms and a lot of east-west traffic and containers and virtualised environments in the data centre itself."

Then you extend it, not necessarily immediately to the public cloud - in some cases to private clouds such as Equinix.

Then you can have several different public cloud providers - Oracle, AWS, Microsoft Azure - and think about the complexities associated with connecting all of these environments, many of which are edge computing environments.

Another point Burris made is that there has been a lot of emphasis on the data explosion, but what about the attendant software explosion as we move into a realm where these distributed services are accessed as both applications and data? Automation and abstraction require software, entities will be defined and policies enforced in software.

There's going to be an enormous explosion in the amount of software that's being generated over the next few years.

But is that the real business issue?

Oracle Vice President Jon Mittelhauser works mostly with Fortune 1000 companies and government departments, where "a lot of our value add is the fast connection between customer data centres. The data can live in either place but I agree that it's the key asset."

For most companies, their data is their asset, not the software. Here in Silicon Valley, the software is highly valued, but outside of Silicon Valley it's the data, or what they do with the data, which software helps you with.

Mansour Karam sees a transition from the days when one began by partnering with hardware vendors.

Once the hardware was agreed, then one decided what software to use.

But that meant being limited to the software offerings that the particular hardware vendor supported. In this new, software-first world, companies start by partnering strategically with software vendors to define this layer of software first, this service layer. Once they've done that, they can go on and shop for hardware that specifically meets their needs.

To sum up, Peter Burris emphasises three key points:


Data Governance And Smart Cities Are Helping Improve Quality Of Life In Japan – Forbes

In 2015, Kakogawa City had the third-worst crime rate in Hyogo, a prefecture in western Japan neighboring Osaka. Local authorities decided to implement a smart networked camera and sensor system they call mimamori, which means "to watch over someone." With this, residents can monitor their children and elderly relatives. The system helps ensure their safety and security while protecting their data and privacy. It's one of the latest examples of how smart cities and data governance are helping improve society in Japan.

Fighting crime in the smart city

Located on the Seto Inland Sea about 30 km west of Kobe, Kakogawa is a city of about 264,000 people. To meet residents' strong demands for safe streets, the municipal government worked with the Ministry of Internal Affairs and Communications and private businesses, including NEC and Nikken Sekkei Research Institute, to launch the mimamori system.


In 2017 and 2018, the city installed about 1,500 networked cameras mainly around schools and school routes. About 2,000 sensors were also installed, both in fixed locations and on 265 government vehicles and 176 Japan Post motorcycles. The system is able to detect residents carrying Bluetooth Low Energy tags to confirm their location. The city is using FIWARE, a framework of open-source components to power smart cities and protect their data. The data is uploaded to cloud servers and the information is made available to volunteers and family members via the Kakogawa App.
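As a rough illustration of how a FIWARE-style system might represent a sensor reading, the sketch below builds an NGSI-v2 entity for a BLE tag's position and shows how it could be posted to a context broker. The broker URL, entity type and attribute names are assumptions for illustration, not Kakogawa's actual schema:

```python
# Sketch: publishing a BLE tag's position as an NGSI-v2 entity, the
# format used by FIWARE context brokers. Hostnames and the entity
# schema here are hypothetical.
import json
import urllib.request

def build_entity(tag_id, lat, lon):
    """Build an NGSI-v2 entity for a BLE tag's last known position."""
    return {
        "id": f"urn:ngsi-ld:BleTag:{tag_id}",
        "type": "BleTag",
        "location": {
            "type": "geo:json",
            # GeoJSON orders coordinates as [longitude, latitude]
            "value": {"type": "Point", "coordinates": [lon, lat]},
        },
    }

entity = build_entity("tag-042", 34.756, 134.841)

# A client would POST this to the broker (not executed here):
req = urllib.request.Request(
    "http://broker.example:1026/v2/entities",
    data=json.dumps(entity).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # would create the entity on the broker
print(entity["id"])
```

Family members' apps would then query the broker for the latest entity state rather than talking to sensors directly, which is what allows the city to centralize access control over the data.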

The networked camera and sensor system has already had an effect. Aside from making residents feel more secure about their loved ones, it has helped cut crime: in November 2018, the crime rate in Kakogawa fell below the Hyogo Prefecture average for the first time.

"We built an environment in which children and elderly people can be monitored by the local community," says Nishimori Yoko, a Kakogawa City official. "The system can be effective in the event of an emergency. We have had several cases of missing people who were located quicker compared to before the system was deployed."

Some residents were worried about leaks of images from the system containing personal information, but the city has emphasized its policies on privacy and data governance. To assuage public concern about the new system, Kakogawa Mayor Yasuhiro Okada visited 12 sites and briefed members of the public. In a survey of 862 residents, over 98% responded that the system was necessary or probably necessary.

Kakogawa City installed about 1,500 networked cameras, mainly around schools and school routes. Residents can monitor loved ones' locations via a city app.

Kakogawa is working with other municipalities around the world to promote smart city policies. Representatives joined the G20 Global Smart Cities Alliance Launch Event held in Yokohama in October 2019 under the aegis of the Cabinet Office of the Government of Japan and the World Economic Forum Centre for the Fourth Industrial Revolution Japan (C4IR Japan). Participants, including representatives from cities such as Barcelona and Cincinnati, discussed issues including the use of technology in smart cities. Fifteen cities announced the launch of the G20 Global Smart Cities Alliance on Technology Governance, which is focused on producing standards for connected devices in public spaces.

Protecting data in Society 5.0

"There's a smart city boom emerging in Japan now, and data governance is the lifeblood of smart cities," says C4IR Japan head Suga Chizuru, who spoke at the Yokohama event. "We're doing this to benefit citizens by tackling technically unattractive issues."

Representatives from smart cities around the world joined the G20 Global Smart Cities Alliance Launch Event held in Yokohama in October 2019.

The Ministry of Economy, Trade and Industry chose Suga for C4IR Japan based on her outstanding performance. At the ministry, she organized a study group aimed at modernizing Japans financial regulations amid the rise of fintech. As she gave presentations on how Japan should embrace fintech, the group gained members and attention. Eventually, it helped get legislative reforms to accelerate the adoption of fintech on the agenda in Japan. At C4IR Japan, Suga is focused on facilitating global consensus around data governance, with expert groups on healthcare, smart cities and mobility.

As more and more municipalities in Japan and overseas turn to smart city solutions to address social issues, the volume of data being generated by cameras and other kinds of sensors is seeing explosive growth. Managing that data is becoming increasingly important amid the expansion of Society 5.0, defined by the Cabinet Office as a human-centered society that balances economic advancement with the resolution of social problems by a system that highly integrates cyberspace and physical space.

Data governance is also at the heart of policies being promoted by the Japanese government and its partners. Research firm Gartner defines data governance as "the specification of decision rights and an accountability framework to ensure the appropriate behavior in the valuation, creation, consumption and control of data and analytics."

That's one of the aims of C4IR Japan, which was established in 2018 in an unprecedented partnership between the WEF, the Japanese government and Japanese organizations and corporations. It's dedicated to maximizing the benefits of the Fourth Industrial Revolution and Society 5.0, which are periods of rapid change driven by progress in science and technology, by promoting open innovation and interoperability in policymaking.

To emphasize that data governance must be a key priority in the Fourth Industrial Revolution, the center hosted a data governance conference in November 2018. Prime Minister Abe Shinzo followed up with a speech at the Davos Forum annual conference in January 2019, announcing that his administration would prepare the Osaka Track for data governance. G20 leaders, together with World Trade Organization Director-General Roberto Azevêdo, joined Abe during the G20 Osaka summit to discuss the importance of the digital economy. They adopted the Osaka Track on the digital economy to craft rules on governance in international data traffic under the motto "Data Free Flow With Trust" (DFFT). The new rules are designed to benefit individuals, businesses, organizations and even smart cities like Kakogawa.

"No country has the best data governance solution, and we're still in an exploration phase as we share knowledge for the best governance framework," says Suga. "Flexible and appropriate data governance will enable societies to enjoy the fruits of the Fourth Industrial Revolution and redistribute its wealth."



How the cloud went from 0 to 100 in ten years – Express Computer

Synergy Research Group's detailed review of enterprise IT spending over the last ten years shows that annual spending on cloud infrastructure services has gone from virtually zero to almost $100 billion. Meanwhile enterprise spending on data center hardware and software has been stagnant through much of the decade. Data center spending did jump in 2018 even though server unit shipments remained flat, thanks to more richly configured and higher-priced servers.

Despite that 2018 increase in data center spending, cloud spending did not miss a beat in 2018 and grew again by almost 40% in 2019. Over the whole decade, average annual spending growth for data center was 4% (mostly due to the first three years) and for cloud services was 56%. 2019 will mark the first time that enterprises spend more on cloud services (IaaS, PaaS and hosted private cloud) than they do on data center equipment.
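Those growth figures can be sanity-checked: compounding at 56% a year for a decade multiplies spending roughly 85-fold, which is consistent with "virtually zero" growing to about $97 billion. A quick check, assuming steady compounding (which the real year-by-year growth only approximates):

```python
# Implication of the reported 56% average annual growth in cloud
# infrastructure spending, compounded over ten years.

growth = 1.56           # 56% average annual growth
years = 10

multiplier = growth ** years        # total growth over the decade
implied_start = 97e9 / multiplier   # implied starting spend, dollars

print(f"{multiplier:.1f}x")         # roughly an 85x increase
print(f"${implied_start / 1e9:.2f}B")  # implied starting point ~$1.1B
```

An implied starting point of around a billion dollars a year is indeed "virtually zero" next to the roughly $100 billion data center market.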

Based on actual spending in Q1-Q3 and its forecast for Q4, Synergy projects that 2019 worldwide spending on data center hardware and software (comprising servers, storage, networking, security and associated software) will be over $93 billion. The major segments with the highest growth rates over the decade were virtualization software, Ethernet switches and network security. Server share of the total data center market remained steady while storage share declined. Synergy projects that 2019 worldwide spending on cloud infrastructure services will reach $97 billion. The major segments with the highest growth rates over the decade were mainly within PaaS especially database, IoT and analytics. IaaS share of the total held reasonably steady while managed private cloud service share declined somewhat.

"The decade has seen a dramatic increase in computer capabilities, increasingly sophisticated enterprise applications and an explosion in the amount of data being generated and processed, pointing to an ever-growing need for data center capacity. However, over half of the servers now being sold are going into cloud providers' data centers and not those of enterprises," said John Dinsdale, a Chief Analyst at Synergy Research Group. "Over the last ten years we have seen a remarkable transformation in the IT market. Enterprises are now spending almost $200 billion per year on buying or accessing data center facilities, but cloud providers have become the main beneficiaries of that spending."



The Benefits of Identity Management for Healthcare Businesses – Solutions Review

Identity management for healthcare businesses offers more than an opportunity to fortify IT infrastructures. It can also help ensure compliance, specifically with the Health Insurance Portability and Accountability Act (HIPAA), and improve the user experience. Therefore, your healthcare business needs to consider the benefits of identity management.

Here, we present the three major benefits of identity management for healthcare businesses, and how it all fits together. However, first, we need to address what makes healthcare identity management so complicated and challenging.

First, healthcare, perhaps more than any other industry, deals with constantly expanding business lines and mergers and acquisitions. This means improved and competitive service for patients, but it also means growing attack surfaces for threat actors.

Additionally, the continually growing networks seen in healthcare result in fragmented patient data. Often, this means sensitive data may exist in unsecured databases, allowing for easy theft. Fragmented patient data can also lead to redundant and unnecessary care, misdiagnosis, and incorrect medication.

So fragmented patient data doesn't just constitute a threat to databases and network security; it can also impact patients' physical safety. Moreover, identity management for healthcare faces challenges typical of other enterprises expanding their IT infrastructures. Usually, these include dealing with on-premises applications, edge devices, and new cloud applications.

Finally, healthcare organizations need to deal with the erosion of the network perimeter and with medical-service devices such as medical IoT. So what can identity management for healthcare businesses actually do to solve these problems?

Obviously, the first benefit of identity management centers on cybersecurity. Hackers frequently target healthcare providers in part because these businesses rarely deploy proper cybersecurity protocols; in fact, according to Armis, WannaCry continues to wreak havoc on healthcare businesses even after the devastating 2017 wave. In other words, healthcare enterprises have not adopted proper cybersecurity protocols despite the known threats targeting them.

Therefore, your healthcare business needs a next-generation solution that repels hackers and maintains consistent access rules through authentication. Strong authentication not only stops hackers outright, it also deters less experienced ones by demonstrating identity management awareness. Also, authentication can fortify and monitor web applications, cloud servers, and patient portals: the various environments in which healthcare operates.

Further, next-generation identity management for healthcare enables your business to benefit from multifactor authentication (MFA). Multifactor authentication is unquestionably the strongest form of authentication available to enterprises of all sizes.

MFA doesn't just rely on passwords to verify users; this is just as well, as passwords prove easy to circumvent, guess, or otherwise subvert. Instead, MFA uses all of the tools available to create a barrier between the user's access request and the data. Factors may include hard tokens, biometrics, geofencing, time-of-access monitoring, and context.
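One widely used possession factor is the time-based one-time password (TOTP) familiar from authenticator apps. The sketch below is a minimal RFC 6238 implementation using only the standard library; a production system should use a vetted library rather than hand-rolled crypto:

```python
# Minimal RFC 6238 TOTP: HMAC-SHA1 over a 30-second time counter,
# dynamically truncated to a short numeric code.
import base64
import hmac
import struct
import time

def totp(secret_b32, at=None, step=30, digits=6):
    """Time-based one-time password (SHA-1, 30-second steps)."""
    key = base64.b32decode(secret_b32)
    counter = int((at if at is not None else time.time()) // step)
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T=59s
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", at=59, digits=8))  # 94287082
```

Because the server and the user's device derive the same code independently from a shared secret and the clock, a stolen password alone is no longer enough to authenticate.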

For healthcare, frequent challenges in identity and access management include lifecycle management, governance, and multiple login points. While numerous next-generation identity management capabilities can help solve these challenges, single sign-on (SSO) can certainly solve the last. SSO helps prevent multiple log-ins, and thus the multiple passwords that expand the attack surface.

If you work in healthcare, you care about HIPAA. This compliance mandate focuses on patient privacy and protections; it involves not only technical safeguards for patients but physical and administrative safeguards as well. In other words, HIPAA places a significant security burden on your healthcare organization.

However, this comes with good news: HIPAA compliance helps your business tap into the markets that use electronic health records, and the overwhelming majority of physicians and hospitals use them.

Thankfully, next-generation identity management for healthcare businesses can help you achieve HIPAA compliance. First, authentication and access management help ensure patients' data stays secure on your networks, fulfilling part of the mandate. Second, solutions with governance capabilities often feature out-of-the-box reporting and automated forms for HIPAA compliance.

Finally, identity management can actually make the patient-user experience, and ultimately their care, better. Through identity federation, disparate databases containing patient information can be centralized and secured simultaneously. This can prevent the problems of fragmented patient data while making sure the information stays out of reach of hackers.

As stated above, single sign-on also avoids the need for continual authentication and log-ins. Identity management can optimize workflows and assist with reviewing coverage, managing claims, and scheduling appointments. In short, this level of personalization should call to mind customer identity and access management (CIAM); it helps maintain consistent patient data and makes patients feel individual and appreciated.

Check out our Identity Management Buyers' Guide. We cover the top solution providers, their use cases, and key capabilities in detail.

Ben Canner is an enterprise technology writer and analyst covering Identity Management, SIEM, Endpoint Protection, and Cybersecurity writ large. He holds a Bachelor of Arts Degree in English from Clark University in Worcester, MA. He previously worked as a corporate blogger and ghost writer. You can reach him via Twitter and LinkedIn.



Suspect Wanted in May Shooting Arrested in St. Cloud – WJON News

ST. CLOUD -- The St. Cloud Police Department has arrested a man wanted in a home invasion in eastern Minnesota last May.

St. Cloud Assistant Police Chief Jeff Oxton said 19-year-old Nicholas James of St. Cloud was arrested on a warrant out of Chisago County just after 1:30 a.m. Wednesday. Oxton says James was arrested in the 1100 block of 13th Street South without incident.

James was wanted on robbery and attempted murder charges.

According to the Chisago County Sheriff's Office, three suspects broke into a home back on May 16th. Authorities say a 22-year-old victim was shot and wounded. The victim was airlifted to a Twin Cities hospital.

The other two suspects were taken into police custody shortly after the incident. The sheriff's office says the suspects and the victim knew each other, and investigators do not believe the incident was random.



Wyze data leak: Key takeaways from server mistake that exposed information from 2.4M customers – GeekWire

Seattle-area startup Wyze offers low-cost video security cameras and other IoT devices. (Wyze Photo)

Post updated at 6 p.m. on Dec. 29.

Seattle-area startup Wyze, a provider of home video cameras and other Internet of Things (IoT) devices, announced on Dec. 26 that it had been informed of a data leak that reportedly exposed the personal information of 2.4 million of its customers.

The problem arose from "a new internal project to find better ways to measure basic business metrics like device activations, failed connection rates, etc.," writes Dongsheng Song, Wyze co-founder and chief product officer, in the company's post.

"We copied some data from our main production servers and put it into a more flexible database that is easier to query," he explains. "This new data table was protected when it was originally created. However, a mistake was made by a Wyze employee on December 4th when they were using this database and the previous security protocols for this data were removed."

Founded in 2017 by a group of Amazon veterans, Wyze offers a series of low-priced cameras, plugs, bulbs and other smart-home devices. The company, based in Kirkland, Wash., has raised $20 million in venture capital. GeekWire has contacted Wyze for additional comment.

To Wyze's credit, it has been very detailed in describing what happened, when, why, how, and what the company is doing about it.

A post by Twelve Security claimed that the leaked data included the following:

Wyze quoted that list in its original post but added, "We don't collect information about bone density and daily protein intake even from the products that are currently in beta testing."

In looking over this event, there are ten key security and privacy takeaways.

Wyze has been upfront about the manner in which it was informed of the leak, with little or no time to mitigate the problem before it was made public. ZDNet's Catalin Cimpanu summed up the feelings of many (likely including Wyze) about whether this disclosure was responsible or not.

These are valid and reasonable concerns. As is often the case regarding the disclosure wars, there likely won't be any resolution, but instead a renewed airing of both sides of the argument. Those supporting the disclosure can and will say the information was public for a number of days and holding that information back prolongs the risk. Those against it will say this just wasn't enough time for the vendor to take action. Either way, this situation shows that the disclosure wars will continue so long as there's no collective agreement on how to handle these situations.

One thing to Wyze's credit: the company clearly jumped on this fast once it broke. Its post states: "Immediately upon hearing about a potential breach, Wyze mobilized the appropriate developers and executives (CEO and CPO) to address the allegations."

It adds later, "This means that all Wyze user accounts were logged out and forced to log in again (as a precaution in case user tokens were compromised as alleged in the blog post). Users will also need to relink integrations with The Google Assistant, Alexa, and IFTTT."

This level of response and these steps are reasonable to address the risks around potentially lost authentication tokens. These are also actions that will impose a burden on users.
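Forcing a global logout like this is commonly implemented by versioning tokens server-side, so every outstanding token can be invalidated in one step. A minimal sketch follows; the data structures are illustrative, not Wyze's actual mechanism:

```python
# Sketch of global session invalidation via per-user token versions:
# each issued token carries the version current at issue time, and a
# token is only valid while its version matches the user's current one.

token_version = {"alice": 1, "bob": 1}
issued = {"tok-abc": ("alice", 1), "tok-xyz": ("bob", 1)}

def force_global_logout():
    """Bump every user's version, making all outstanding tokens stale."""
    for user in token_version:
        token_version[user] += 1

def is_valid(token):
    user, version = issued.get(token, (None, None))
    return user is not None and token_version[user] == version

force_global_logout()
print(is_valid("tok-abc"))   # False: the user must log in again
```

The appeal of this design is that invalidation is O(users), not O(tokens): nothing has to be found and deleted, and every integration relying on a stale token fails closed.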

Going back to our first point, people can and will argue how much of this response is due to the nature of the disclosure. But these are good, concrete steps, which put security ahead of ease-of-use: Wyze is risking user frustration for better security.

One thing that Wyze isn't doing, however, is forcing password resets on users. While Wyze has said that passwords weren't stolen, it's often hard to be certain. And if the current situation involving Amazon's Ring has taught us anything, it's that people regularly reuse passwords, especially where IoT devices are concerned. Not forcing a password reset misses an opportunity to be thorough in the response and improve overall customer security.

Ring has been in the news a lot lately for being hacked. As I've noted, those hacks boil down to the inherent weakness of relying on passwords. This situation is different because it's a leak of data held by Wyze. In fact, it appears that password information wasn't even involved.

In this case, even if youve used two-factor authentication (2FA), you still are at risk from this data breach.

If the Ring situation has reminded us of the risks of password reuse and the overall weakness of passwords as a security measure for IoT, this breach helps show us the risks inherent in losing the kind of data used by IoT and health-related devices in the home.

By their very nature, IoT devices are integrated into our most intimate spaces. Cameras in particular represent a major window into our most protected personal spaces, as weve seen in the reactions to the Ring situation.

Looking at the information that was potentially lost in this breach, we get a more concrete sense of what IoT data breaches can mean in real terms.

In particular, Wyze notes that the data loss includes: "List of all cameras in the home, nicknames for each camera, device model and firmware. WiFi SSID, internal subnet layout, last on time for cameras, last login time from app, last logout time from app."

This data is troubling because it can give very specific information that is useful for real-world crime. People regularly name devices in ways that are descriptive for themselves, not expecting the names to become publicly known. For example, people might name a camera in a child's room "Betty's Room." Information like this can tell an attacker who is in the house, where they might be and where the cameras are placed. All of this is useful to people who want to enter the home for malicious purposes.

One thing that Wyze has not recommended, which I would recommend, is that users rename their internal WiFi SSIDs, rename their cameras and potentially reposition those cameras. All these steps can mitigate the risks of that information now being publicly accessible.

Another piece of the exposed data is this: "Height, Weight, Gender, Bone Density, Bone Mass, Daily Protein Intake, and other health information for a subset of users."

Wyze goes to some length to point out that this lost information only affects a very small subset of its users, specifically 140 external beta testers. Yes, that is a very small number of people. But the information that was exposed is very sensitive and very personal health information. It's a reminder of the nature of the data being handled by IoT and health devices.

The similarities to the Capital One data breach are striking. In this case, as Wyze says: "a mistake was made by a Wyze employee on December 4th when they were using this database and the previous security protocols for this data were removed."

While this isnt exactly the same thing that happened with Capital One, in both cases you have data that was accessible in the cloud without appropriate security protections due to human error. Its also notable that in both cases, auditing and monitoring failed to catch the misconfiguration.

Both of these cases are a reminder that, unfortunately, when things are deployed to the cloud, the risks of exposure and breach are frequently greater. And in terms of IT operations and practice, the controls and countermeasures often aren't as robust and mature for cloud deployments as they are for traditional on-premises deployments.
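The kind of misconfiguration audit that failed in both incidents can be sketched in a few lines. This is a hypothetical, simplified check, not Wyze's or Capital One's actual tooling: the rule schema, field names and port list are illustrative assumptions.

```python
import ipaddress

# Common database ports (Postgres, MySQL, Elasticsearch, MongoDB) -- an
# illustrative list, not an exhaustive one.
DB_PORTS = {5432, 3306, 9200, 27017}

def find_exposed_databases(rules):
    """Flag firewall rules that leave a database port open to the
    whole internet with no authentication required."""
    exposed = []
    for rule in rules:
        open_to_world = (
            ipaddress.ip_network(rule["source"]) == ipaddress.ip_network("0.0.0.0/0")
        )
        if rule["port"] in DB_PORTS and open_to_world and not rule.get("auth_required", False):
            exposed.append(rule["name"])
    return exposed

rules = [
    {"name": "metrics-es", "port": 9200, "source": "0.0.0.0/0",  "auth_required": False},
    {"name": "prod-db",    "port": 5432, "source": "10.0.0.0/8", "auth_required": True},
]
print(find_exposed_databases(rules))  # ['metrics-es']
```

Running a check like this continuously, rather than once at deployment, is what catches the case where a previously secured database has its protections removed by a later change.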

For startups, there are two lessons, as well. One is cautionary and the other potentially positive.

First the cautionary tale: speed kills.

Once again, to its credit, Wyze is open about what happened, and there's a very clear message for startups. From the company's posting: "To help manage the extremely fast growth of Wyze, we recently initiated a new internal project to find better ways to measure basic business metrics like device activations, failed connection rates, etc. We copied some data from our main production servers and put it into a more flexible database that is easier to query."

Two things happened here that are common for startups. First, the company experienced sudden, fast growth. Second, it moved quickly to address the implications of the growth.

As noted above, it was during this fast move that, at some point, the security that had protected the data was removed by an employee.

It's great that Wyze was able to move fast to address issues related to its fast growth. But this is also a reminder that speed can kill. Mistakes happen when things move fast and there's little checking. This is a risk that all startups face and should be conscious of.

Of course, the speed that can kill you as a startup can also save you. The fast response that we see from Wyze is an example of the speed startups can achieve. Another positive aspect of this speed is shown in the company's statement that it is going to bump up the priority of user-requested security features beyond two-factor authentication.

If we compare and contrast this with Ring's response to its current situation, the difference is stark. Ring has made no announcements of any major plans to improve security capabilities in the wake of stories of Ring devices being hacked. By contrast, Wyze has committed early and openly to reworking its prioritization of new user-requested security features.

Here too is another lesson for startups: use the speed and agility that being a startup gives you to move quickly to turn disadvantage into advantage.

In its post, Wyze very clearly refuted the claim that it is sending data to Alibaba's cloud in China. A question and answer in the post speaks directly to this:

Is there validity to the claim that Wyze is sending user data to China?

Wyze does not use Alibaba Cloud. The claim made in the article that we do is false.

It goes on to note that the company has employees and manufacturers in China, but "Wyze does not share user data with any government agencies in China or any other country."

The fact that this claim was made and Wyze feels a need to refute it points to another takeaway: there is an emerging, almost McCarthyite trend lately to imply or allege that tech companies with ties to China are storing data in China and/or sharing data with the Chinese government. We've seen similar insinuations regarding TikTok as well.

Partly, this represents the sort of speculation that can fill a vacuum when companies don't provide clear information themselves about where they store their data. A few years ago, people, especially in Europe, were concerned about data being stored in the United States and possibly being subject to seizure under the Patriot Act. Now, people are concerned about data being stored in China and accessible by the government there.

One thing companies can do to mitigate this concern is to be open about where they store data.

Beyond that, though, there is clearly heightened concern now about data being stored and shared with China, and that concern is manifesting in claims and insinuations about data being stored or shipped there.

The Wyze breach is a serious one. And Wyze deserves credit for doing a lot of things right, quickly, in response. But as we dig into it more, we can see that this situation raises a number of issues around IoT devices, data storage, security and incident response.

We can all learn from this, which is one reason why it's so good that the Wyze team has been open and up front about the situation: it helps the industry learn and grow collectively. And because Wyze is a startup, its experience and response hold particular lessons for other up-and-coming companies in the IoT space.

Update: Wyze disclosed an additional issue in a Dec. 29 update to its post.

"We have been auditing all of our servers and databases since then and have discovered an additional database that was left unprotected. This was not a production database and we can confirm that passwords and personal financial data were not included in this database. We are still working through what additional information was leaked as well as the circumstances that caused that leak."

We've also clarified our post above to note that Wyze says it doesn't collect information about protein intake or bone density, contrary to a report that said such data was included in the leak.

Read more:
Wyze data leak: Key takeaways from server mistake that exposed information from 2.4M customers - GeekWire

Conquering the Cyber Security Challenges of the Cloud – CPO Magazine

Cloud computing has become a prevalent force, bringing economies of scale and breakthrough technological advances to modern organizations, but it is more than just a trend. Cloud computing has evolved at an incredible speed and, in many organizations, is now entwined with the complex technological landscape that supports critical daily operations.

This ever-expanding cloud environment gives rise to new types of risk. Business and security leaders already face many challenges in protecting their existing IT environment. They must now also find ways to securely use multiple cloud services, supported applications and underlying technical infrastructure.

The surge in business processes supported by cloud services is well evidenced by the volume of confidential data that organizations now store in the cloud environment. Yet organizations are still unsure whether to entrust cloud service providers (CSPs) with their data. CSPs generally provide a certain level of security, as substantiated by multiple surveys, but cloud-related security incidents do occur.

CSPs cannot be solely responsible for the security of their customers' critical information assets. Cloud security relies equally on the customer's ability to implement the right level of information security controls. Nevertheless, the cloud environment is complex and diverse, which hinders a consistent approach to deploying and maintaining core security controls. It is vital that organizations are aware of and fulfill their share of the responsibility for securing cloud services to successfully address the cyber threats that increasingly target the cloud environment.

As organizations acquire new cloud services, they typically choose these from a selection of multiple CSPs and therefore need to deal with a multi-cloud environment, which is characterized by the use of two or more CSPs.


Organizations favor a multi-cloud environment because it allows them to pick and choose their preferred cloud services across different CSPs (e.g. AWS, Microsoft Azure, Google Cloud, Salesforce). However, each individual CSP adopts its own jargon, its own specific technologies and approaches to security management. The cloud customer therefore needs to acquire a wide range of skills and knowledge to use different cloud services from multiple CSPs securely.

Organizations require a range of different users to be able to access cloud services securely from within the organization's network perimeter through secure network connections (e.g. via a gateway). However, organizations also need their cloud services to be accessible from outside the internal perimeter by business partners and by users travelling off-site or working remotely, all connecting through a selection of secure network connections as dictated by the organization.

While CSPs provide a certain level of security for their cloud services, organizations need to be aware of their security obligations and deploy the necessary security controls. This requires organizations to understand and address the many security challenges presented by the complex and heterogeneous aspects of the cloud environment.

Our ISF members have identified several obstacles to operating securely in the cloud environment. The main challenges include:

The rapid explosion of cloud usage has accentuated these challenges and, in some instances, left organizations insufficiently prepared to tackle the security concerns associated with using cloud services.

Securing the use of cloud services is a shared responsibility between the CSP and the cloud customer. The security obligations incumbent on the CSP are to protect the multi-tenant cloud environment, including the backend services and physical infrastructure, as well as to prevent the commingling of data between different customers.

While the CSP maintains much of the underlying cloud infrastructure, the cloud customer is responsible for securing its data and user management. Whether the customer's responsibility extends to performing security configurations for applications, operating systems and networking will depend on the cloud service model selected.

This shared responsibility for security can create confusion and lead to over-reliance on the CSP to mitigate threats and prevent security incidents. It is essential that the cloud customer does not depend wholly on the CSP to deploy the appropriate security measures, but clearly understands how responsibility for security is shared with each CSP in order to identify and deploy the requisite security controls to protect the cloud environment.
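The way responsibility shifts with the service model can be sketched as a simple lookup. The layer names and assignments below are a common approximation used for discussion, not any CSP's official responsibility matrix.

```python
# Index of each service model into the responsibility tuples below.
MODELS = {"iaas": 0, "paas": 1, "saas": 2}

# Simplified shared-responsibility split: who secures each layer
# under each service model (illustrative assumption, not vendor doctrine).
RESPONSIBILITY = {
    #                      IaaS        PaaS        SaaS
    "data and access":    ("customer", "customer", "customer"),
    "applications":       ("customer", "customer", "csp"),
    "operating system":   ("customer", "csp",      "csp"),
    "physical facility":  ("csp",      "csp",      "csp"),
}

def owner(layer, model):
    """Return who secures the given layer under the given service model."""
    return RESPONSIBILITY[layer][MODELS[model]]

print(owner("operating system", "iaas"))  # customer
print(owner("operating system", "paas"))  # csp
print(owner("data and access", "saas"))   # customer
```

The last line is the key point of the paragraph above: no matter how far up the stack the CSP's duties extend, securing data and user access never stops being the customer's job.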

An organization using an on-premises IT data center will know exactly where its critical and sensitive data resides and can exert full control over the movement of its data. This helps considerably when implementing security controls, whereas in the cloud environment, data moves in and out of an organization's perimeter more freely. This can obscure where critical and sensitive data is located, and how it can be protected, which can hinder an organization's ability to effectively enforce the requisite security controls across all of its cloud services in line with compliance requirements.

While it is the cloud customer's responsibility to ensure the security of its data in the cloud environment, the customer's control over its data is intrinsically limited, since the data is stored by an external party (the CSP) in an off-site location, often in a different country. Moreover, CSPs will often leverage several data centers in geographically distinct locations to ensure the organization's data is stored on more than one server for reasons of resilience. This creates additional complexity in terms of managing data across borders, understanding where it is located at a given moment in time, determining the applicable legal jurisdiction and ensuring compliance with relevant laws and regulations, an obligation that rests fully with the cloud customer, not the CSP.

Modern organizations must operate at a fast pace, delivering new products and services to stay ahead of the competition. Many are therefore choosing to move ever further towards cloud computing, as the elasticity and scalability offered by cloud services provide the desired flexibility needed to compete. For an organization to have confidence that it can move to the cloud whilst ensuring that vital technological infrastructure is secure, a robust strategy is required.

The cloud environment has become an attractive target for cyber attackers, highlighting the pressing need for organizations to enhance their existing security practices. Yet consistently implementing the fundamentals of cloud security can be a complicated task due to the diverse and expanding nature of the cloud environment.

This is but one of many challenges that organizations need to overcome to use cloud services securely. Organizations cannot rely purely on CSPs to secure their critical information assets but must accept their own share of responsibility. This responsibility calls for a combination of good governance, deployment of core controls and adoption of effective security products and services. Controls that cover network security, access management, data protection, secure configuration and security monitoring are not new to information security practitioners, but they are critical to using cloud services securely.

Moving forward, organizations can select from a variety of trends and technologies that will enable them to use cloud services securely, from the adoption of new products to the embedding of improved processes, such as a focus on secure containers, where security is given greater emphasis during development.

Assuring that services are used securely will provide business leaders with the confidence they need to fully embrace the cloud, maximizing its potential and driving the organization forward into the future.

The rest is here:
Conquering the Cyber Security Challenges of the Cloud - CPO Magazine

City of Cottonwood now a certified sustainable business | Sedona.Biz – The Internet Voice of Sedona and The Verde Valley – Sedona.biz

City of Cottonwood is now a Certified Sustainable Business at the Conservationist/Bronze level.

Sedona AZ (January 1, 2020) – The City of Cottonwood has worked diligently on behalf of its citizens and the Verde Valley to reduce water consumption. In 2005, the city became the largest water provider in the Verde Valley after purchasing the six private water companies that previously served the Cottonwood area. The city immediately began making repairs and upgrades to the water production and distribution system to improve its reliability and reduce the lost and unaccounted-for water.

They also installed 20 arsenic treatment systems to comply with new water quality standards for arsenic that went into effect in 2006. As a result of these efforts, the city is pumping about 30 percent less water today than the six private water companies were pumping in 2000, and its total gallons per capita per day (GPCD) has decreased from around 171 to 87, one of the lowest total GPCD figures for a municipality in the State of Arizona.
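For readers unfamiliar with the metric, GPCD is simply total daily system demand divided by the population served. The figures below are hypothetical, chosen only to illustrate the arithmetic; the population value is not the city's actual service population.

```python
def gpcd(gallons_per_day, population):
    """Gallons per capita per day: total daily water demand per person served."""
    return gallons_per_day / population

# Hypothetical illustration of the reported drop from ~171 to ~87 GPCD.
population = 12_000  # assumed service population, for illustration only
print(round(gpcd(2_052_000, population)))  # 171
print(round(gpcd(1_044_000, population)))  # 87
```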

The city is also diligently working to expand its use of reclaimed water by strategically installing purple pipe to accommodate the use of reclaimed water for irrigation at the large turf areas throughout the city. The city currently requires the use of reclaimed water for all construction use and delivers reclaimed water for irrigation to Cottonwood Ranch, plus the community garden, airport, and viticulture garden at Yavapai College. In the Spring of 2020, the city will begin delivering reclaimed water for irrigation to Riverfront Park, with plans to expand its use to the cemetery.

Cottonwood is also addressing energy use throughout the city. It installed solar systems at the Riverfront Reclamation Facility and the Recreation Center pool, and now has LED lights in Old Town lamp posts, city hall, the tennis courts, and the recreation center. The city is moving data storage from conventional servers to cloud servers and is looking to replace conventional pool pumps with variable speed pumps. These combined efforts reduce costs and the consumption of energy from non-renewable resources.

The city has also taken steps to reduce the potential discharge and disposal of chemicals and pharmaceuticals into the environment by hosting household hazardous waste and pharmaceutical collection events. These events remove toxic chemicals and pharmaceuticals that may otherwise be flushed into the water system or end up in the wrong hands.

Cottonwood also has good employment practices. Employees have access to exercise facilities for physical health and to an Employee Assistance Program for mental health. The city also does a lot to support the community, including sponsoring Food for Finds, the Neighborhood Officer Program, National Night Out, City Selfie Day, Toys for Tots, the Thanksgiving Turkey Drive, and Steps to Recovery.

The City is currently working with the Sustainability Alliance to complete a 5-year sustainability plan. See who else is certified.

Go here to see the original:
City of Cottonwood now a certified sustainable business | Sedona.Biz - The Internet Voice of Sedona and The Verde Valley - Sedona.biz

A restaurant server got a massive tip and it is helping to change her life – KFOR Oklahoma City

(CNN) It's never a bad idea to start the New Year on a generous note.

Danielle Franzoni, a server in Alpena, Michigan, started hers on the receiving end of that generosity. She waited on a couple at the restaurant where she works during the final days of 2019.

Their bill was $23. They tipped a festive $2,020.

"Happy New Year," the anonymous couple wrote on the bill. "2020 Tip Challenge."

Franzoni couldn't believe it. She asked her boss whether it was too good to be true, but the tip was legit and seasonally appropriate.

"Things like this don't happen to people like me," she told the Alpena Times.

It had been a difficult year for Franzoni. She moved to Alpena to start over, she said, as a recovering addict who'd lived in a homeless shelter.

But with her customers generosity, she could see the clouds starting to clear. She even moved into her own home the same week.

"I'm gonna build a future because of this," she told CNN affiliate WXYZ. "My kids have a future, and I have a home. It's a really big deal."

Tipping servers for the New Year

The kind act that landed Franzoni $2,020 is similar to another tipping challenge, "Tip the Bill," which took off in 2018. Customers were encouraged to tip 100% and surprise their servers.

It seems the only stipulation of the 2020 tipping challenge is to keep the year in the total.

If you partake in New Year tipping, you don't have to go big: Franzoni told the Alpena Times she later tipped a server $20.20 on her dinner bill.

"That was my pay-it-forward," she said. "I couldn't do the other one."

Read more from the original source:
A restaurant server got a massive tip and it is helping to change her life - KFOR Oklahoma City

Cloud Hopper Attacks Far More Extensive than First Thought – CloudWedge


Chinese hacker group APT10 had been plundering the cloud installations of dozens of businesses for over three years before a news report by Reuters made its actions public knowledge. Now, further digging into the scandal has revealed that the group's impact was far more extensive than initially suspected. Several major cloud providers have fallen prey to the group. However, many companies have failed to inform their clients that they may be the victims of this particular hack. Providers, hoping to protect their reputations, had simply told their clientele that the issue was dealt with when it wasn't.

A report issued by the Wall Street Journal on December 30, 2019 notes that at least a dozen cloud providers were caught in the breach, including massive brands like IBM and Canada's CGI Group. Managed service providers are the ideal target for these hackers, since once they breach the initial security, they have access to any of the data that the companies which use the service have stored on the server.

The WSJ report comes on the heels of a Reuters scoop last year, which initially broke the news about APT10 and Cloud Hopper. The newest findings mention that over 10,000 records of US Navy personnel were taken. The impact on company reputations has made it difficult for service providers to disclose details about the attack. However, the lack of knowledge about the events makes it even more difficult for cybersecurity firms and departments to work out what happened. The UK's National Cyber Security Centre issued warnings to companies that they should be extremely wary of cloud providers that are unwilling to share information about security breaches.

Over the last year since the story broke, APT10 has gone mostly silent. The US Justice Department has arrested two individuals it thinks took an active part in the campaign. However, certain security companies still report software within the cloud pinging known APT10 IPs, making it likely that the group is still operational in some way.

See the rest here:
Cloud Hopper Attacks Far More Extensive than First Thought - CloudWedge