Category Archives: Cloud Servers

GOP Data Firm Accidentally Leaks Personal Details of Nearly 200 Million American Voters – Gizmodo

Political data gathered on more than 198 million US citizens was exposed this month after a marketing firm contracted by the Republican National Committee stored internal documents on a publicly accessible Amazon server.

The data leak contains a wealth of personal information on roughly 61 percent of the US population. Along with home addresses, birthdates, and phone numbers, the records include advanced sentiment analyses used by political groups to predict where individual voters fall on hot-button issues such as gun ownership, stem cell research, and the right to abortion, as well as suspected religious affiliation and ethnicity. The data was amassed from a variety of sources, from the banned subreddit r/fatpeoplehate to American Crossroads, the super PAC co-founded by former White House strategist Karl Rove.

Deep Root Analytics, a conservative data firm that identifies audiences for political ads, confirmed ownership of the data to Gizmodo on Friday.

UpGuard cyber risk analyst Chris Vickery discovered Deep Root's data online last week. More than a terabyte was stored on the cloud server without the protection of a password and could be accessed by anyone who found the URL. Many of the files did not originate at Deep Root, but are instead the aggregate of outside data firms and Republican super PACs, shedding light onto the increasingly advanced data ecosystem that helped propel President Donald Trump's slim margins in key swing states.
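
For context on how an exposure like this works: an Amazon S3 storage bucket whose policy permits public reads can be enumerated by anyone on the internet, no credentials required. The snippet below is a minimal, hypothetical sketch of that kind of anonymous listing using Python and boto3; the bucket name is made up and is not the actual Deep Root bucket.

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Hypothetical bucket name -- not the actual Deep Root bucket.
BUCKET = "example-exposed-bucket"

# An unsigned client sends no credentials, mimicking an anonymous visitor.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

# If the bucket policy allows public listing, this succeeds for anyone who knows the URL.
response = s3.list_objects_v2(Bucket=BUCKET, MaxKeys=10)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```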

Although files possessed by Deep Root would be typical in any campaign, Republican or Democratic, experts say the exposure of so much of it in a single open database raises significant privacy concerns. "This is valuable for people who have nefarious purposes," Joseph Lorenzo Hall, the chief technologist at the Center for Democracy and Technology, said of the data.

The RNC paid Deep Root $983,000 last year, according to Federal Election Commission reports, but its server contained records from a variety of other conservative sources that were paid millions more, including The Data Trust (also known as GOP Data Trust), the Republican party's primary voter file provider. Data Trust received over $6.7 million from the RNC during the 2016 cycle, according to OpenSecrets.org, and its president, Johnny DeStefano, now serves as Trump's director of presidential personnel.

The Koch brothers' political group Americans for Prosperity, which had a data-swapping agreement with Data Trust during the 2016 election cycle, contributed heavily to the exposed files, as did the market research firm TargetPoint, whose co-founder previously served as director of Mitt Romney's strategy team. (The Koch brothers also subsidized a data company known as i360, which began exchanging voter files with Data Trust in 2014.) Furthermore, the files provided by Rove's American Crossroads contain strategic voter data used to target, among others, disaffected Democrats and undecideds in Nevada, New Hampshire, Ohio, and other key battleground states.

Deep Root further obtained hundreds of files (at least) from The Kantar Group, a leading media and market research company with offices in New York, Beijing, Moscow, and more than a hundred other cities on six continents. Each file offers rich details about political ads (estimated cost, audience demographics, reach, and more) by and about figures and groups spanning the political spectrum. There are files on the Democratic Senatorial Campaign Committee, Planned Parenthood, and the American Civil Liberties Union, as well as files on every 2016 presidential candidate, Republicans included.

What's more, the Kantar files each contain video links to related political ads stored on Kantar's servers.

Spreadsheets acquired from TargetPoint, which partnered with Deep Root and GOP Data Trust during the 2016 election, include the home addresses, birthdates, and party affiliations of nearly 200 million registered voters in the 2008 and 2012 presidential elections, as well as some 2016 voters. TargetPoint's data seeks to resolve questions about where individual voters stand on dozens of political issues. For example: Is the voter eco-friendly? Do they favor lowering taxes? Do they believe the Democrats should stand up to Trump? Do they agree with Trump's "America First" economic stance? "Pharmaceutical companies do great damage": agree or disagree?

The details of voters' preferences on issues like stem cell research and gun control were likely drawn from a variety of sources, according to a Democratic strategist who spoke with Gizmodo.

"Data like that would be a combination of polling data, real-world data from door-knocking and phone-calling and other canvassing activities, coupled with modeling using the data we already have to extrapolate what the voters we don't know about would think," the strategist said. "The campaigns that do it right combine all the available data together to make the most robust model for every single voter in the target universe."

In a statement, Deep Root founder Alex Lundry told Gizmodo, "We take full responsibility for this situation." He said the data included proprietary information as well as publicly available voter data provided by state government officials. "Since this event has come to our attention, we have updated the access settings and put protocols in place to prevent further access," Lundry said.

Deep Root's data was exposed after the company updated its security settings on June 1, Lundry said. Deep Root has retained Stroz Friedberg, a cybersecurity and digital forensics firm, to investigate. "Based on the information we have gathered thus far, we do not believe that our systems have been hacked," Lundry added.

So far, Deep Root doesn't believe its proprietary data was accessed by any malicious third parties during the 12 days that the data was exposed on the open web.

Deep Root's server was discovered by UpGuard's Vickery on the night of June 12 as he was searching for data publicly accessible on Amazon's cloud service. He used the same process last month to detect sensitive files tied to a US Defense Department project and exposed by an employee of a top defense contractor.

This is not the first leak of voter files uncovered by Vickery, who told Gizmodo that he was alarmed over how the data was apparently being used; some states, for instance, prohibit the commercial use of voter records. Moreover, it was not immediately clear to whom the data belonged. "It was decided that law enforcement should be contacted before attempting any contact with the entity responsible," said Vickery, who reported that the server was secured two days later on June 14.

Deep Root's data sheds light on the increasingly sophisticated data operation that has fed recent Republican campaigns and lays bare the intricate network of political organizations, PACs, and analysis firms that trade in bulk voter data. In an email to Gizmodo, Deep Root said that its voter models are used to enhance the understanding of TV viewership for political ad buyers. "The data accessed was not built for or used by any specific client," Lundry said. "It is our proprietary analysis to help inform local television ad buying."

However, the presence of data on the server from several political organizations, including TargetPoint and Data Trust, suggests that it was used for Republican political campaigns. Deep Root also works primarily with GOP customers (although similar vendors, such as NationBuilder, service the Democrats as well).

Deep Root is one of three data firms hired by the Republican National Committee in the run-up to the 2016 presidential election. Founded by Lundry, a data scientist on the Jeb Bush and Mitt Romney campaigns, the firm was among the analytics teams that worked on the Trump campaign following the party's national convention in the summer of 2016.

Lundry's work brought him into Trump's campaign war room, according to a post-election AdAge article that charted the GOP's 2016 data efforts. Deep Root was hand-picked by the RNC's then-chief of staff, Katie Walsh, in September of last year and joined two other data shops, TargetPoint Consulting and Causeway Solutions, in the effort to win Trump the presidency.

Walsh, who now works for the nonprofit America First Policies after a brief stint in the White House, oversaw Trump's data operation in partnership with Brad Parscale, Trump's digital director. (Parscale did not respond to a request for comment before press time. Attempts to reach Walsh for comment were also unsuccessful.) Walsh and Parscale focused their efforts on three categories of voters, AdAge reports: voters who might be predisposed to support Trump, Republican voters who were uncertain about Trump, and voters who were leaning toward Hillary Clinton but could be persuaded by Trump's message of changing up government-as-usual.

To appeal to the three crucial categories, it appears that Trump's team relied on voter data provided by Data Trust. Complete voter rolls for 2008 and 2012, as well as partial 2016 voter rolls for Florida and Ohio, apparently compiled by Data Trust, are contained in the dataset exposed by Deep Root.

Data Trust acquires voter rolls from state officials and then standardizes the voter data to create a clean, manageable record of all registered US voters, a source familiar with the firm's operations told Gizmodo. Voter data itself is public record and therefore not particularly sensitive, the source added, but the tools Data Trust uses to standardize that data are considered proprietary. That data is then provided to political clients, including analytics firms like Deep Root. While Data Trust requires its clients to protect the data, it has to take clients at their word that industry-standard encryption and security protocols are in place.

TargetPoint and Causeway, the two firms employed by the RNC in addition to Deep Root, apparently layered their own analytics atop the information provided by Data Trust. TargetPoint conducted thousands of surveys per week in 22 states, according to AdAge, gauging voter sentiment on a variety of topics. While Causeway helped manage the data, Deep Root used it to perfect its TV advertising targets, producing voter turnout estimates by county and using that intelligence to target its ad buys.

A source with years of experience working on political campaign data operations told Gizmodo that the data exposed by Deep Root appeared to be customized for the RNC and had apparently been used to create models for turnout and voter preferences. Metadata in the files suggested that the database wasn't Deep Root's working copy, but rather a post-election version of its data, the source said, adding that it was somewhat surprising the files hadn't been discarded.

Because the data from the 2008 and 2012 elections is outdated (the source compared it to the kind of address and phone data one could find on a lousy internet lookup site), it's not very valuable. Even the 2016 data is quickly becoming stale. "This is a proprietary dataset based on a mix of public records, data from commercial providers, and a variety of predictive models of uncertain provenance and quality," the source said, adding: "Undoubtedly it took millions of dollars to produce."

Although basic voter information is public record, Deep Root's dataset contains a swirl of proprietary information from the RNC's data firms. Many of the filenames indicate they potentially contain market research on Democratic candidates and the independent expenditure committees that support them. (Up to two terabytes of data contained on the server was protected by permission settings.)

One exposed folder is labeled "Exxon-Mobile" [sic] and contains spreadsheets apparently used to predict which voters support the oil and gas industry. Divided by state, the files include the voters' names and addresses, along with a unique RNC identification number assigned to every US citizen registered to vote. Each row indicates where voters likely fall on issues of interest to ExxonMobil, the country's biggest natural gas producer.

The data evaluates, for example, whether or not a specific voter believes drilling for fossil fuels is vital to US security. It also predicts if the voter thinks the US should be moving away from fossil-fuel use. The ExxonMobil national score document alone contains data on 182,746,897 Americans spread across 19 fields.

Some of the data included in Deep Root's dataset veers into downright bizarre territory. A folder titled simply "reddit" houses 170GB of data apparently scraped from several subreddits, including the controversial r/fatpeoplehate, a community that posted pictures of people and mocked them for their weight before it was banned from Reddit's platform in 2015. Other subreddits that appear to have been scraped by Deep Root or a partner organization focused on more benign topics, like mountain biking and the Spanish language.

The Reddit data could have been used as training data for an artificial intelligence algorithm focused on natural language processing, or it might have been harvested as part of an effort to match up Reddit users with their voter registration records. During the 2012 election cycle, Barack Obama's campaign data team relied on information gleaned from Facebook profiles and matched profiles to voter records.

During the 2016 election season, Reddit played host to a legion of Trump supporters who gathered in subreddits like r/The_Donald to comb through leaked Democratic National Committee emails and craft pro-Trump memes. Trump himself participated in an Ask Me Anything session on r/The_Donald during his campaign.

Given how active some Trump supporters are on Reddit (r/The_Donald currently boasts more than 430,000 members), it makes sense that Trump's data team might be interested in analyzing data from the site.

A FiveThirtyEight analysis that looked at where r/The_Donald members spend their time when they're not talking politics might shed some light on why Deep Root collected r/fatpeoplehate data. FiveThirtyEight found that, when Redditors weren't commenting in political subreddits, they most often frequented r/fatpeoplehate.

It's possible that Deep Root intended to use data from r/fatpeoplehate to build a more comprehensive profile of Trump voters. (Lundry declined to comment beyond his initial statement on any of the information included in the Deep Root dataset.)

However, FiveThirtyEight's investigation doesn't account for Deep Root's collection of data from mountain-biking and Spanish-language subreddits that weren't as popular with r/The_Donald members, and data from subreddits not so closely linked to Trump's diehard supporters might be more useful for his campaign's goal of pursuing swing voters.

"My guess is that they were scraping Reddit posts to match to the voter file as another input for individual modeling," a source familiar with campaign data operations told Gizmodo. "Given the number of random forums, my guess is they started with a list of accounts to scrape from, rather than scraping from all forums then trying to match from there (in which case you'd start with the political ones)."

Matching voter records with Reddit usernames would be complicated, and any large-scale effort would likely result in many inaccuracies, the source said. However, campaigns have attempted to match voter files with social media profiles in the past. Such an effort by Deep Root wouldn't be entirely surprising, and would likely yield rich data on the small portion of users it was able to match with their voter profiles, the source explained.

The Deep Root incident represents the largest known leak of Americans' voter records, outstripping past exposures by several million records. Five voter-file leaks over the past 18 months exposed between 350,000 and 191 million files, some of which paired voter data (name, race, gender, birthdate, address, phone number, party affiliation, etc.) with email accounts, social media profiles, and records of gun ownership.

Campaigns and the data analysis firms they employ are a particularly weak point for data exposure, security experts say. Corporations that don't properly secure customer data can face significant financial repercussions; just ask Target or Yahoo. But because campaigns are short-term operations, there's not much incentive for them to take data security seriously, and valuable data is often left out to rust after an election.

"Campaigns are very narrowly focused. They are shoestring operations, even presidential campaigns. So they don't think of this as an asset they need to protect," the Center for Democracy and Technology's Hall told Gizmodo.

Even though voter rolls are public record and are easy to access (Ohio, for instance, makes its voter rolls available to download online), their exposure can still be harmful.

Voter registration records include ZIP codes, birthdates, and other personal information that have been crucial in research efforts to re-identify anonymous medical data. Latanya Sweeney, a professor of government and technology at Harvard University, famously used voter data to re-identify Massachusetts Governor William Weld from information in anonymous hospital discharge records.
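
Sweeney's result rests on the fact that a few quasi-identifiers (ZIP code, birthdate, gender) shared between a voter file and nominally anonymous records are often enough to single a person out. The toy sketch below, using the pandas library and entirely made-up records, illustrates how that join works.

```python
import pandas as pd

# Made-up, illustrative records only.
voters = pd.DataFrame([
    {"name": "Jane Doe", "zip": "02138", "birthdate": "1945-07-31", "gender": "F"},
    {"name": "John Roe", "zip": "73069", "birthdate": "1980-02-12", "gender": "M"},
])

# "Anonymous" hospital discharge records stripped of names but not quasi-identifiers.
discharges = pd.DataFrame([
    {"zip": "02138", "birthdate": "1945-07-31", "gender": "F", "diagnosis": "hypertension"},
])

# Joining on the quasi-identifiers re-attaches a name to the medical record.
reidentified = discharges.merge(voters, on=["zip", "birthdate", "gender"], how="inner")
print(reidentified[["name", "diagnosis"]])
```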

Because of the personal information they contain, voter registration databases can also be useful in identity theft schemes.

Even though exposure of Deep Root's data has the potential to harm voters, it's exactly the kind of data that campaigns lust after and will spend millions of dollars to obtain. Campaigns are motivated to accumulate as much deeply personal information about voters as possible, so they can spend their ad dollars in the right swing districts where they're likely to sway the greatest number of voters. But voter data rapidly goes stale and campaigns close up shop quickly, so data is seen as disposable and often isn't well-protected.

"I can think of no avenues for punishing political data breaches or otherwise properly aligning the incentives. I worry that if there's no way to punish campaigns for leaking this stuff, it's going to continue to happen until something bad happens," Hall said. The data left behind by campaigns can pose a lingering security issue, he added. "None of these motherfuckers were ever Boy Scouts or Girl Scouts, they don't pack out what they pack in."

[UpGuard]


IBM appears to have excess cloud servers to shift at low, low, prices – The Register

Cloud computing prices come down regularly, but IBM's just offered a price cut of a sort The Register hasn't seen before: a temporary discount on bare-metal servers running just one CPU family.

That CPU family is the Xeon E5-26XX v3 range, which Intel introduced from Q3 2014 through to mid-2015. In other words, old-ish, slow-ish stuff, certainly compared to the Xeon Scalable Family arriving any week now.

And perhaps rather unloved, too, as IBM's Bluemix cloud says you can have them at an unspecified discount, for a limited time only and only while current inventories are available.

Which rather suggests that IBM has quite a few unused servers that it needs to find workloads for, stat.

Why might that be? Perhaps a substantial client has upgraded to new servers or quit Bluemix altogether. Perhaps IBM's cloud is just not attracting customers. Or maybe IBM just got its forecasting wrong and bought too many servers.

Whatever the reason, it's a curious offer and one that highlights Bluemix's lack of a spot price server market like those offered by Azure and AWS.

Analyst firm Gartner's IaaS magic quadrant last week opined that IBM's cloud "has not improved significantly since the IBM acquisition in mid-2013 [and is] missing many cloud IaaS capabilities required by midmarket and enterprise customers."

It appears it may also be missing customers willing to pay full-freight for Xeon E5-26XX v3 servers.


Dell EMC Hybrid Cloud System for Microsoft Review (Azure Pack) – StorageReview.com

June 16th, 2017 by StorageReview Enterprise Lab

The Dell EMC Hybrid Cloud System for Microsoft debuted in late 2015 as the first validated hybrid cloud system that implemented Microsoft Cloud Platform System (CPS) Standard. The Dell EMC Hybrid Cloud System for Microsoft combines PowerEdge hardware, Dell EMC Networking, and a software stack built with Windows Azure Pack and System Center 2012 R2. Dell EMC's vast engineering resources focused on creating a turnkey experience for new Dell EMC hybrid cloud administrators with a unified interface and a variety of licensing schemes. This was the company's second collaboration with Microsoft on Azure-based cloud systems. Prior to CPS Standard, the companies worked together to offer Microsoft CPS Premium, targeted at much larger deployments. The Dell EMC Cloud for Microsoft Azure Stack is on deck as the company's next Azure-based offering, expected to be available later this year.

The Cloud, and people's opinion of it, has gone through quite the evolution in the last few years. At first it was seen as a source of bulk cheap storage that wasn't safe. As time went on, the security concerns began to fade. Organizations weren't just using the Cloud as a source of bulk storage or a replication target; they soon began to host several of their applications in the Cloud. Now there are hundreds of organizations that are either cloud-first (the company begins and remains mainly in the cloud) or cloud-centric (the company still has on-prem gear but runs a majority of its business through the cloud). Of the three types of clouds (private, public, and hybrid), the fastest growing seems to be the hybrid version. Capitalizing on this, Dell EMC continues to work with Microsoft to deliver an on-prem hybrid cloud for Microsoft shops. Additionally, Dell EMC feels it offers an incredible amount of value to customers deploying these large, and many times complex, solutions by finding bugs and sorting out patch issues well before a customer gets their hands on them. That way, when Microsoft's Patch Tuesday comes around, Dell EMC's Hybrid Cloud Team makes the update process painless for the end customer. Dell EMC also notes value from integrations with value-added services like backup and encryption, along with its one-call support for the complete hardware-software stack and ongoing life-cycle management.

Dell EMC provided us remote access to a Hybrid Cloud System for Microsoft that was hosted at the Dell Customer Solution Center in Austin, Texas. Dell EMC's Hybrid Cloud System is built on PowerEdge C6320 servers with Intel Xeon E5-2600 V3 processors that host up to 400 virtual machines. PowerEdge R730 servers provide file server functionality, while Dell EMC MD1400/1420 DAS arrays are configured with between 32TB and 128TB of raw storage space. For network connectivity, Dell EMC Networking S4048 10G switches are leveraged.

The DHCS comes with some configurability for users who want more or less of certain aspects. For example, the minimum configuration is one S4048 switch for networking, one PowerEdge C6320 server for compute, and a cluster of four PowerEdge R730 servers for storage. The minimum configuration comes with no backup Data Protection Manager (DPM) servers. On the flip side, if users need the maximum of everything, the DHCS can be configured with two S4048 switches for redundancy, four PowerEdge C6320 servers (16 sleds for compute), three backup DPM servers, and six PowerEdge R730 servers (two storage hosts and four storage enclosures, all accessible to the compute nodes).

The overall value proposition of a hybrid cloud hinges on making it straightforward to granularly allocate resources to the local private cloud, as well as to offsite public- and private-cloud hosts. An integrated hybrid cloud environment could succeed or fail based on whether it creates a consistent user and administrative experience across all of the resources it manages. At this point in the evolution of cloud services, it is also vital that cloud infrastructure integrates seamlessly with backup and disaster recovery services.

On the operating system and software end, Dell EMC's implementation is based on Windows Server 2012 R2 with System Center 2012 R2 and the Windows Azure Pack. Azure is the center of the user experience, as well as the interface for most administrative tasks. During our testing for the review we had access to Azure Backup, Azure Site Recovery, and Azure Operational Insights. Dell's PowerEdge management system includes OpenManage Integration for System Center and Dell iDRAC 7 with LifeCycle Controller.

Dell Hybrid Cloud System for Microsoft Specifications

Management

Our review focuses on the experience of using the Windows Azure Pack for management, although System Center 2012 R2 is also available for "traditional" Windows administration workflows. While we were working with Azure, we wanted to be sure to experience the process of deploying infrastructure as a service, database as a service, and Azure's disaster recovery functions.

The Azure "tenant portal" is the center of the administrative experience. After selecting Azure Pack tenant, users simply need to log in, and they can then easily provision a new VM.

Once logged in, users see everything they have created, including VMs, gallery items, and databases. The left-hand side shows resource providers that are part of the plan. For new deployments, users need to click on the +New tab in the bottom-left corner.

After choosing new, a pop-up screen comes up with a variety of options. Here we select "Standalone Virtual Machine" from the options listed.

After selecting Standalone Virtual Machine, we are brought to the Standalone Virtual Machine gallery. Given a variety of choices, we are going with A2_Full. To the right of the selection is info about the VM.

Once users select the VM they want, they will be prompted to enter a name, their username and password, as well as a product key. After that, users need to indicate where the VM will be deployed. Once this is complete, users need only to click the checkmark in the bottom right-hand corner to finish the deployment.

Next, we are looking at how to provision a Database as a Service (DBaaS). Going back to the Service Management portal, users simply choose MySQL Database on this go round.

Once the Database is selected, users are prompted with a window asking to name the database and the edition. This is followed by a second window asking for credentials.

After the database is created, users again go back to the Service Management Portal. The DB1 database created a moment ago appears, along with some general information. For the next step, we will need the server name, which can be found under the info tab at the bottom. Once we have that, we click on +New to provision the VM role.

On the top of the left-hand side is the option for creating the VM Role.

Here we will be creating a WordPress instance tied to the database, and then we will scale it out to multiple instances. We select WordPressExtDB and hit the arrow on the bottom right. The wizard will prompt us to name the WordPress database (WP01 in this case) and choose the version. The wizard will continue to step through settings such as the compute name pattern, time zone, root account credentials, DNS domain name, and SSH key.

The last step uses the information from the previous steps to finish setting up the WordPress instance. Hitting the checkmark will deploy it.

Once the WordPress instance is deployed, users can set up an account or create a web site on the web front end of WordPress. A WordPress account needs to be set up before the next step. Once the account has been set up, users can go back to the tenant portal of Azure and click on the WordPress instance to define its role.

Within the role of the WordPress instance, simply select "Scale" and move the slider to the desired number of instances.

Finally, we will be configuring Azure as a recovery point for disaster recovery. The DHCS comes configured to use Azure Site Recovery, which means fewer steps for users. This feature is easy enough that, rather than showing a step-by-step walkthrough, it can simply be summed up in a few sentences. Users need to subscribe to a plan or add-on that has VM protection enabled, then create a virtual network that defines how the VM will connect on failover, and finally create a VM for the failover.

Azure Operational Insights aggregates log data across platforms, operating systems, and clouds to provide enterprise-wide analysis.

The Dell EMC Hybrid Cloud Team feels its biggest value-add is simplifying the process for users to make updates and patches. An automated patch and update system for firmware, BIOS, drivers, and software is offered that is designed to be non-disruptive. The update framework includes intelligent dependency analysis that tests and packages patches and updates before deployment. This makes sure users spend their time managing their own needs, versus spending time making sure updates don't break existing functionality. Moreover, Dell EMC says that it is typical to have the hybrid cloud system operational in less than three hours.

Conclusion

The Azure-centric services deployment model is a change from traditional Microsoft server administration, but one that felt intuitive and polished during our time working with the Dell EMC Hybrid Cloud System. Intuitive user experience is growing more important as the number of applications and services required for business continues to increase in most sectors.

As a cloud-native management environment, Azure has been built from the ground up with the expectation that users and administrators may need to be able to provision services and storage with granular control over whether data and compute will be hosted in the private or public cloud. Azure's backup and disaster recovery functionality provides the means to implement a variety of common configurations in both regards, while at the same time simplifying the process to do so.

Combining years of Microsoft collaboration with the EMC heritage of turnkey hybrid cloud systems (Enterprise Hybrid Cloud and Native Hybrid Cloud), Dell EMC is now focusing its Azure-based solutions efforts on the forthcoming Dell EMC Cloud for Microsoft Azure Stack. Announced just ahead of Dell EMC World 2017 in May, this solution will combine all of Dell EMC's past experience into a similar, yet new Microsoft-based hybrid cloud offering. Taking many of these concepts further, Dell EMC notes that its Azure Stack-based hybrid cloud system will move the experience from a disaggregated storage model (from DHCS) to a true hyper-converged model (APIs will also allow users to write once and run applications on any Azure cloud). Dell EMC says its long history with Microsoft, combined with its turnkey hybrid cloud platform experience, will give it a leg up as Azure Stack hits the market later this year. We look forward to the release of the Dell EMC Cloud for Microsoft Azure Stack and the opportunity to conduct a closer examination and review.


Ensuring Cloud Security – CIOReview

Cloud technology has taken over almost every organization's networking arena, exerting an incredible influence on mobility, security, and data safety. Even though modern cloud security leverages top-level encryption techniques, complete security is still a matter of debate. Today, cloud computing security encompasses various security policies as well as control-based technologies designed to adhere to regulatory compliance rules and protect the data, applications, information, and infrastructure associated with the cloud.

Cloud computing security solutions are gaining pace quickly, providing many of the same functions as traditional IT security, including protecting critical information from theft, data leakage, and deletion. For a better understanding, take a look at some of the top cloud security concerns in the arena today.

Cloud software interfaces are not always trustworthy. The Cloud Security Alliance warns users to be cautious when using cloud interfaces, especially third-party interfaces that interact with cloud services. Depending on a weak set of interfaces and APIs exposes organizations to a variety of security issues related to information confidentiality and integrity. In addition, it recommends learning how the cloud service provider integrates security throughout its services, from activity monitoring to user authentication and access control.

Encrypting data when it is on the provider's server, as well as when the data stored in the cloud is in use, are other important aspects to consider in maintaining security. Various reports suggest that only a few cloud providers assure protection for data being used within an application. When selecting a service, it is important to ask potential cloud providers how they secure user data not only when it's in transit but also when it's on their servers and being accessed by cloud-based applications. In addition, make sure that the service provider securely disposes of all data when it is no longer required.
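
One practical way to keep data protected at rest regardless of what the provider does is to encrypt it client-side before it ever reaches the cloud. Below is a minimal sketch using Python's cryptography package; the file names are illustrative, and a real deployment would store the key in a proper key management service rather than in code.

```python
from cryptography.fernet import Fernet

# Generate a key; in practice this belongs in a key management service,
# never alongside the encrypted data.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a local file before uploading it to the cloud provider.
with open("report.csv", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

with open("report.csv.enc", "wb") as f:
    f.write(ciphertext)

# Later, after downloading report.csv.enc, the same key decrypts it.
plaintext = fernet.decrypt(ciphertext)
```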

All data that travels within the network or between the organization's network and the cloud server must be encrypted. In particular, data traffic between the cloud and the organization should pass through a secure channel. Make sure that when connecting to the provider, the URL begins with HTTPS (Hypertext Transfer Protocol Secure), the secure version of HTTP. Moreover, all data should be authenticated and encrypted using standard protocols such as Internet Protocol Security (IPsec).
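
As a quick sanity check that traffic to a provider endpoint really does travel over an encrypted channel, the negotiated TLS session can be inspected before any data is sent. A small sketch using only Python's standard library follows; the hostname is a placeholder.

```python
import socket
import ssl

HOST = "storage.example-cloud-provider.com"  # placeholder endpoint

context = ssl.create_default_context()  # verifies the certificate chain and hostname

with socket.create_connection((HOST, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        print("TLS version:", tls.version())        # e.g. TLSv1.2 or TLSv1.3
        print("Cipher suite:", tls.cipher())
        print("Certificate subject:", tls.getpeercert()["subject"])
```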

Because several users in an organizational environment access the data stored in the cloud, there should be per-user access controls to ensure safety. Each user should be allowed access only to the data they need, and that data has to be categorized. Access should be granted only on the basis of a user's role or the data's sensitivity.
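
In code, that rule amounts to gating every read on both the user's role and the data's sensitivity label. The sketch below is a generic illustration with made-up roles and levels, not tied to any particular cloud provider's access-control system.

```python
# Sensitivity levels, lowest to highest.
LEVELS = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}

# Maximum sensitivity each role may read; roles and mappings are illustrative.
ROLE_CLEARANCE = {
    "intern": "public",
    "analyst": "internal",
    "finance": "confidential",
    "security_admin": "restricted",
}

def can_read(role: str, data_label: str) -> bool:
    """Allow access only if the role's clearance covers the data's sensitivity."""
    clearance = ROLE_CLEARANCE.get(role, "public")
    return LEVELS[clearance] >= LEVELS[data_label]

assert can_read("finance", "confidential")
assert not can_read("intern", "restricted")
```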

Cloud computing offers businesses of all sizes numerous benefits in data storage and analytics. CIOs should address cloud security issues with the service provider before entrusting data to its servers and applications. For the best results, make sure these security challenges are addressed from the first time the organization connects to the cloud.


Tech Data Launches Small Business Cloud Server On Microsoft Azure – ECM Connection (press release)

All-in-one service designed to meet the needs of SMBs with platform for building cloud services revenues

Clearwater, FL /PRNewswire/ - Tech Data Corporation (Nasdaq: TECD) today announced the launch of a cloud-based solution designed for small to medium-sized businesses (SMBs) on Microsoft Azure. The new Small Business Cloud Server has been developed to meet the growing demand from SMB customers for a scalable and flexible IT platform that is cost-effective and easy to deploy and manage. The solution is available to partners across the distributor's entire global footprint through its Tech Data Cloud business.

"Small and medium-sized businesses are now starting to embrace the cloud with real enthusiasm," says Michael Urban, corporate vice president, Corporate Strategy and Business Transformation and responsible for Tech Data's global cloud strategy. "Many users have already migrated to Microsoft Office 365, and there is a massive opportunity now for resellers to move small business servers onto the Microsoft Azure platform.

"The Tech Data Small Business Cloud Server makes it easy for resellers to help them take that next stepand to take the burden of applications and security management off their small business customers as well. It is an ideal base upon which small businesses can start to construct their own efficient and cost-effective cloud ecosystem, and for SMB resellers to use as a platform on which they can build their cloud services revenues."

The service is available in three sizes: 1-6 users, 7-20 users, and 40+ users. Detailed monthly usage and billing information will be provided to resellers via StreamOne, and pre-determined usage thresholds and limits can be set if required. In addition to a Microsoft Azure virtual machine, the Tech Data Small Business Cloud Server comes with Microsoft Office 365 (Exchange, Office and SharePoint) software and provides backup, storage and VPN functionality. Remote desktop is optional for remote access.

Alyssa Fitzpatrick, general manager, SMB Sales for Microsoft Corp., stated: "Microsoft applauds Tech Data's initiative and the investment in setting up its Small Business Cloud Server on Microsoft Azure. It provides SMB customers an option when looking for a flexible and cost-effective cloud-based platform for their IT solutions, and especially for those deploying Microsoft Office 365, Dynamics 365 and Microsoft Enterprise Mobility + Security solutions."

A Small Business Cloud Server can be set up in minutes using StreamOne, Tech Data's cloud provisioning and billing platform. Users can easily be added and removed, while servers can be deployed and managed remotely, enabling resellers to offer swift and inexpensive set-up services.

"Microsoft Azure gives specialized businesses like ours a new world of options when it comes to truly optimizing our back-office environmentno more one-size-fits-all boxes," said Mike Farlow, president at ComTech Network Solutions, an SMB solution provider and Tech Data partner. "Small Business Cloud Server is a great option for SMBs as it provides the flexibility that we've not seen, and it allows us to grow as our business warrants, without the traditional limitations of resources."

"The Small Business Cloud Server solution from Tech Data is very easy to deploy and manage; also, the resources provided by Tech Data were simple to understand and gave us the necessary knowledge and confidence to be successful in this space," said Jim Hunton, owner of Onsite Technical Services, an IT solution provider and Tech Data partner. "We plan on also showcasing this solution to our customers, as it will enable us to provide more options and advise on which solution best helps solve for their business and IT needs."

Tech Data's approach to the cloud helps partners excel through specialization while capitalizing on the high-growth, next-generation technologies that are redefining the future. In addition to the cloud, partners can leverage in-depth expertise in complementary technology market segments, including cognitive computing, the data center, data analytics, the Internet of Things (IoT), mobility, security and enterprise networking, and training and education.

For more information on Tech Data Cloud in the Americas, contact tdcloud@techdata.com or call (800) 237-8931. In Europe, contact cloud@techdata.eu.


About Tech Data
Tech Data Corporation is one of the world's largest wholesale distributors of technology products, services and solutions. Its advanced logistics capabilities and value-added services enable 115,000 resellers to efficiently and cost-effectively support the diverse technology needs of end users in more than 100 countries. Tech Data generated $26.2 billion in net sales for the fiscal year ended January 31, 2017. It is ranked No. 107 on the Fortune 500 and one of Fortune's "World's Most Admired Companies." To learn more, visit http://www.techdata.com, or follow us on Facebook and Twitter.


Tech Data Unveils Cloud Products, Services for Microsoft Azure, AWS – Channel Partners


It's a busy week at global distributor Tech Data, with several cloud-related announcements, two related to Microsoft, and another about Amazon Web Services (AWS).

The new Agent365 service, introduced Thursday, supports Microsoft Office 365 and other Microsoft cloud applications. It provides resellers with operational systems and resources, including billing, provisioning, management and help desk.

Immediately available in the U.S., Agent365 enables resellers to transition from the retired Microsoft Advisor program while keeping the commissions coming in and preserving outsourced billing. They also get to remain the partner of record for their customers. Tech Data will offer a 24/7 helpdesk for its reseller partners to support their end customers.

Tech Data outlined a number of the program's benefits.

On Wednesday, Tech Data's Technology Solutions business, formerly a division of Avnet, announced the Technology Solutions AWS Reserved Instance (RI) Optimization Managed Service, an enhancement to its AWS offering available through its Cloud Marketplace, the StreamOne Enterprise Solutions Store.

The new service is fee-based and is designed to address RI optimization and efficiency through several included features.

Tech Data also added Marketplace Premium to the Cloud Marketplace. It includes: brand-ready storefronts with white-label logos; custom URLs and basic colors; self-sign-up to various cloud provider programs; easy access to existing Cloud Marketplace solutions and services; the ability to customize products and solutions; and more.

Technology Solutions Cloud Marketplace (formerly Avnet) and Tech Data's StreamOne cloud platform are available to all of Tech Data's partners.

Also announced this week is the new Small Business Cloud Server, a cloud-based solution for small and medium-sized businesses (SMBs) on Microsoft Azure, a service that's available in three sizes: one to six users, seven to 20 users, and 40+ users.

In addition to a Microsoft Azure virtual machine, the Tech Data Small Business Cloud Server comes with Microsoft Office 365 (Exchange, Office and SharePoint) software and provides backup, storage and VPN functionality. Remote desktop is optional for remote access.

"Small and medium-size businesses are now starting to embrace the cloud with real enthusiasm," said Michael Urban, corporate vice president, corporate strategy and business transformation, who is responsible for Tech Data's global cloud strategy. "Many users have already migrated to Microsoft Office 365 and there is a massive opportunity now for resellers to move small business servers onto the Microsoft Azure platform."


Student records unintentionally made public on OU mail servers – Norman Transcript

NORMAN - Private and sensitive information about past and present University of Oklahoma students was available to anyone with a campus-issued email account.

Student records with details as sensitive as financial aid information, Social Security numbers and eligibility status could be accessed through a document sharing system linked to campus emails over the course of a month. It was first discovered by OU's student-run newspaper, The Oklahoma Daily, which notified university administration and ran a story Wednesday about the type of records available.

Upon learning of the security issue (much of the data available was protected by the Family Educational Rights and Privacy Act, or FERPA), OU shut down the Microsoft file-sharing program Delve, which was available to students through the campus Microsoft Office 365 software. According to The Daily's report, users were able to search for documents and records containing information about other students.

"At no point was the security of OU IT systems breached," said Matt Hamilton, registrar and vice president for enrollment and student financial services. "Rather, some sensitive files were inadvertently made accessible to OU account holders due to a misunderstanding of privacy settings."

In his statement, Hamilton contends no unauthorized person other than the author of the report accessed any of the files mentioned in the OU Daily story.

Microsoft Delve works with another program, SharePoint, to allow users to share and access documents. Users place documents in SharePoint; Delve enables them to search for those documents.

"Any SharePoint site with the open privacy setting was searchable to any user within the OU system," Hamilton said. This is how The Daily was able to access the sensitive data in question.

In its story, The Daily notes that any data gathered for the purposes of the story was deleted once the story was published.

It also states no stories will be written based on any records found.

The records available, according to The Daily, ranged from scholarship money students received to Social Security numbers, academic performance and eligibility of student athletes based on drug test results, academic performance and recruiting violations.

The records were made available when the university moved SharePoint to cloud servers May 14, OU spokesperson Rowdy Gilbert said.

Hamilton said some OU departments used the program to share files with each other, which is legal under FERPA.

"However, in some cases, the privacy setting options of these sites were misinterpreted, inadvertently allowing access to any OU account holder," Hamilton said.

Delve remains shut down to any OU user. The SharePoint sites mentioned in The Daily's story have now had access restricted to authorized staff users only, Hamilton said.

"While there was no outside breach of our files, we understand and acknowledge concerns about the vulnerability of sensitive data," he said. "We rectified the situation immediately and can assure students that their FERPA-protected files are secure. Moving forward, we will continue to evaluate our privacy measures to ensure absolute protection of personal data."

Gilbert said since OU faculty and staff handle sensitive information daily, there are strict guidelines and expectations they are required to uphold.

"We have no evidence that this expectation has been violated," Gilbert said.

Students reacted to the report with concern. While there is no sense the information was made available on purpose, there is a worry the records were so widely available at all.

"I don't think the university was using the files and information for anything negative, but it's an issue that anyone, not just school employees, could look at or use that information," said Dan Williams, a junior studying political science. "I think taking it down is a great response, but I think they need to be constantly monitoring data inside of OU.

"We switch platforms all the time, and any time you make these changes, you have to make sure the data is safe. It's possible that private information is out in the public and we don't know about it, and that is very concerning."

According to The Daily's findings, the records that were available include:

29,000-plus cases of protected data disclosed

18,668 financial aid records of freshman classes from 2012-2016

4,585 Pell Grant recipients

626 semester GPAs for student athletes and managers

539 visa types and statuses for international students

133 semester GPAs for students on President Leadership Council

30 Social Security numbers


IBM and HPE’s Server Businesses Aren’t Just Pressured By the Cloud Anymore – TheStreet.com

Following a long string of quarters in which the server sales of enterprise IT firms such as IBM Corp. (IBM), HP Enterprise Co. (HPE), Lenovo and Cisco Systems Inc. (CSCO) have felt the ill effects of public cloud infrastructure adoption, it's generally well-understood how the preference of companies like Amazon.com Inc. (AMZN), Alphabet Inc. (GOOGL), Facebook Inc. (FB) and (increasingly) Microsoft Corp. (MSFT) to design their own servers and have them supplied by Asian contract manufacturers (ODMs) has become a headwind for the old guard. Especially as more and more business software workloads are either migrated to a public cloud, or built from the start to be run on one.

What might not be as well-appreciated yet, and which is driven home by some estimates and sales figures released this week, is how the IT giants are also now pressured by the aggressive efforts of one of their peers to take share. Namely, Dell Technologies Inc. (DVMT), which is making full use of the expanded resources it has at its disposal after merging with storage giant EMC last year.


On Tuesday, June 6, IDC estimated global server revenue fell 4.6% annually in Q1 to $11.8 billion. The next day, Gartner estimated Q1 server sales fell 4.5% to $12.5 billion (a slightly different methodology appears to be responsible for a higher revenue estimate). Neither number was all that surprising, given the firms respectively estimated 4.6% and 1.9% declines for Q4.

It also wasn't too surprising that sales of servers designed by cloud giants and supplied by ODMs grew strongly following a Q4 lull, as the likes of Amazon and Facebook continued spending heavily on capex. IDC estimated sales of such servers, which it refers to as ODM Direct, grew 41.8% to $1.2 billion (10.4% of industry revenue). It added one unnamed cloud firm single-handedly accounted for over 10% of the 2.21 million servers shipped during the quarter.

What was surprising, though, is that both firms reported Dell, the world's second-biggest server vendor, saw meaningful sales growth in spite of the headwinds faced by peers. IDC estimated Dell's server sales grew 4.7% to $2.37 billion, leading its market share to rise to 20.1% from 18.3% a year ago. By contrast, the firm had estimated Dell's server sales were roughly flat in Q4. Gartner gave Dell a 19% Q1 share, up from 17.3%.

Dell confirmed its share gains on June 8 when the company reported its server and networking revenue grew 5% in the April quarter to $3.2 billion. The company added sales of its mainstay PowerEdge enterprise servers, which run on Intel Corp. (INTC) and (to a lesser degree) AMD Inc.'s (AMD) x86 CPUs, grew by double digits. That offset lower "high-volume cloud" server sales.

IDC thinks HPE's Q1 server share fell to 24.2% from 27.5% (still good for first place), with revenue dropping 15.8% to $2.86 billion. The company announced last week its server revenue fell 14% in its April quarter. Like Dell, HPE's cloud sales have been falling, as a major client (believed to be Microsoft) relies more on internal and open-source designs. Both Dell and HPE have also noted their cloud server sales tend to carry lower margins; the latter suggested on its earnings call it might pare its investments in this space to focus on higher-margin opportunities.

IBM had an even tougher time in Q1. IDC thinks Big Blue's server sales, hurt not only by cloud adoption and Dell but also by mainframe cyclicality and share losses for its Power server line relative to x86 servers, fell 34.7% to $745 million, with its share dropping to 6.3% from 9.2%.

That allowed Cisco to grab third place from IBM. IDC estimates the networking giant's server share grew fractionally to 7%, in spite of a 3% decline in revenue to $825 million. Cisco has blamed recent pressure on the business on an industry shift toward rack servers relative to blade servers, while insisting it's making efforts to right the ship.

Lenovo, which bought IBM's x86 server unit in 2014, rounded out the top-5. The Chinese tech giant was granted a 6.2% share, down from 7% a year ago, with revenue estimated to drop 16.5% to $727 million.

Overall, HPE, IBM, Cisco and Lenovo saw their server share fall 690 basis points in Q1 to 43.7%. That's easily worse than their performance in Q4, when IDC estimated their combined server share fell 490 basis points to 48.7%.

Clearly, Dell's rejuvenation is making a bad situation worse. Since closing the EMC merger last September, the company has been pitching enterprises on an end-to-end IT lineup that pairs Dell's servers and networking hardware with EMC's storage hardware and software, as well as the virtualization and infrastructure management software provided by EMC's VMware (VMW) unit. The company has also moved to integrate Dell and EMC's sales forces and reseller partner efforts. In addition, some Dell and EMC resellers have felt pressured to boost their sales to remain one of the merged company's preferred partners.

All of these efforts are certainly paying off in the server market. Going forward, the launch of Dell's 14th-generation PowerEdge servers (announced in May) could provide a fresh boost. They promise improved hardware and software-based security features, better high-speed flash storage support and revamped management software tools and tech support services. This week, HPE countered by unveiling x86 servers promised to have an unmatched ability to protect a server's firmware, as well as greater support for "persistent memory" modules that combine DRAM with flash storage.

Along with Dell, HPE, Cisco and Lenovo will likely get a server sales boost from Intel's anticipated mid-summer launch of Xeon CPUs based on the company's Skylake architecture. IBM, which no longer sells x86 servers, won't be so lucky. But either way, the server businesses of all four companies have to contend with major challenges that a chip refresh can only provide temporary relief for.


How to Create a Laravel-Based LAMP Stack on Ubuntu – ProgrammableWeb

Laravel has become the most-used PHP framework for projects of all scopes (there are other PHP frameworks too, like CodeIgniter, Symfony, Yii, and Zend). Whether you're working on a simple web app or a huge corporate portal, Laravel is up for the task. The robust framework is very versatile and is supported by a very passionate community of developers and users. Another good thing about Laravel is how easy installing and launching a Laravel project is in all development environments.

One exception to this rule is the installation of the Laravel framework on cloud servers. In this tutorial, you'll go through the steps required to set up a Laravel-powered LAMP stack on Ubuntu. For the purpose of this tutorial, you'll be using a DigitalOcean-based cloud server running Ubuntu 16.04.2. DigitalOcean is a cloud infrastructure provider out of New York City. You don't need to use DigitalOcean; you just need access to an Ubuntu server, which could be located on your own network or in the cloud on one of the many services that compete against DigitalOcean (for example, Amazon).

Go to the DigitalOcean sign up page and sign up with your ID. An email will be sent to your ID, and you should verify it and log in to your DigitalOcean account. Once you're logged in, go to Create a New Droplet. Choose your distribution, size and the data center of your server, as shown in the following GIF.

In order to install the Laravel PHP framework, you'll need access to a server's command line interface (CLI). The most common and convenient way of connecting to a Linux-based cloud server's CLI is with the Secure Shell (SSH) application. This shell offers a secure communication channel for connecting to and executing commands on the cloud server.

As an application, SSH comes pre-built into Linux and Mac OS X environments. If you want to access the server's CLI with Windows, download and use PuTTY. In this example, you'll use PuTTY. To connect to the cloud server, you must have the following credentials:

Fire up PuTTY and fill in the server IP address. PuTTY launches directly into this dialog box.

Click Open. You'll see a security alert notifying that you've not connected to this server before. If you're sure that you've got the IP address right, click Yes.

Next, you're prompted for your login credentials first thing in the terminal window. Insert the login credentials (username and password) for the server.

Note: You won't be able to see the password in the console screen.

Now, you're successfully connected to the server.
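
If you'd prefer to script the connection rather than click through PuTTY, the same SSH session can be opened programmatically. Here is a minimal sketch using the third-party paramiko library (install it with pip); the IP address and credentials are placeholders for your own droplet's details.

```python
import paramiko

HOST = "203.0.113.10"       # placeholder droplet IP
USERNAME = "root"
PASSWORD = "your-password"  # placeholder; key-based auth is preferable

client = paramiko.SSHClient()
# Accept the host key on first connection, equivalent to PuTTY's security alert.
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, username=USERNAME, password=PASSWORD)

# Run a quick command to confirm the session works.
stdin, stdout, stderr = client.exec_command("lsb_release -a")
print(stdout.read().decode())

client.close()
```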

A LAMP stack is an integrated and interconnected setup of open source software. The setup comprises Linux, the Apache web server, a database, and PHP; the M traditionally refers to MySQL, an open source relational database management system (RDBMS).

At least when it comes to Linux-based servers, the LAMP stack is for the most part what's minimally needed to run a website. For this reason, it's perhaps the most common solution for setting up servers for web development. An important reason for this universal choice is the cost factor: all components of the LAMP stack are open source and, therefore, free to use.
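
Once connected, standing up the stack itself comes down to a handful of package installs followed by creating the Laravel project with Composer. The sketch below drives those commands from Python via subprocess purely for illustration; you could just as easily type the same apt-get and composer commands directly in the shell. Package names assume Ubuntu 16.04, and the project name 'myapp' is arbitrary.

```python
import subprocess

def run(cmd):
    """Run a shell command on the server and fail loudly if it errors."""
    print("+", cmd)
    subprocess.run(cmd, shell=True, check=True)

# Apache, MySQL, PHP and the extensions Laravel commonly needs (Ubuntu 16.04 names).
run("apt-get update")
run("apt-get install -y apache2 mysql-server php libapache2-mod-php "
    "php-mysql php-mbstring php-xml php-zip unzip composer")

# Create a new Laravel project under the web root; the name 'myapp' is arbitrary.
run("composer create-project --prefer-dist laravel/laravel /var/www/myapp")

# Laravel needs write access to its storage and cache directories.
run("chown -R www-data:www-data /var/www/myapp/storage /var/www/myapp/bootstrap/cache")
```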


The 3 Best Server Companies to Buy in 2017 – Madison.com

The global server market isn't the most flashy part of the technology industry, but it is one of the most important sectors in all of tech. Whether physical or cloud-based, servers are the backbone of modern computing. Without them, our tech-laden world could not exist.

That's why the companies that make servers can be steady, reliable investments. International Business Machines (NYSE: IBM), Hewlett-Packard Enterprise (NYSE: HPE), and Intel (NASDAQ: INTC) are three prominent names to consider in this space. Let's take a closer look at them.

The shift from physical servers to cloud-computing products is one of the biggest trends in the server market today. It's also the reality for IT giant IBM.

IBM saw its server sales decline 22% from $7.5 billion in 2015 to $5.9 billion in 2016. The company is in the third year of the sales cycle for its z13 mainframe server, and it emphasized that it continued to add server clients in 2016. However, in fiscal 2016 alone, cloud service revenue rose 49%, from roughly $3.9 billion to $5.9 billion, more than compensating for the revenue decline in physical servers.

Of course, selling cloud services involves increasing the number of its own servers in use. It's just that the physical servers are separated from the cloud-computing software, which changes the sales model but not the computing product being delivered. So IBM's revenue model will always be connected to the server market, even as the way it sells its computing power changes along with the rest of the industry in the years to come.


Servers are the largest revenue driver for Hewlett-Packard Enterprise, the largest server producer by market share. Server sales accounted for $14 billion of the company's $50.1 billion in total 2016 sales.

Hewlett-Packard Enterprise has so far weathered the changes in the server industry better than expected, as server sales for the IT giant fell just 0.7% in fiscal 2016. However, some of the long-term trends affecting the sector may be catching up with it, as server sales declined 14% year over year, from $3.5 billion to $2.9 billion, in the most recent quarterly report.

The bigger problem is that the company lacks a dedicated cloud suite of products like IBM does. IBM can hope to hold on to many of its customer relationships by transitioning their solutions from physical servers to cloud computing as a service. Whether Hewlett-Packard Enterprise can figure out a similar solution will be the thing for investors to watch in the coming years.

The world's largest chipmaker rules over the server market; estimates of its share of server microprocessors hover around 99%. So the company stands a great chance of continuing to print profits from the computing infrastructure space, even as the way computing is delivered changes.

That's great news, because Intel's data-center group, which represents the company's server sales, produced a 43% operating margin in fiscal 2016. This segment has, however, seen its operating margin decline from as high as 50% in 2014, a drop that Intel seems to pin on increased operating expenses.

If that's so, it implies that the company hasn't lost its top-line pricing power. That seems like a possibility, as leading cloud companies such as Amazon.com, Alphabet, and others could seek volume discounts for their server chips. Investors will want to keep an eye on this trend, but there's little doubt that Intel is one of the most indispensable companies in the server market today.

