
Serverless Architectures from an MSP’s Point of View – MSPmentor

Serverless architectures are getting a lot of focus now as a next-gen application platform and potentially as a successor to containers on public cloud platforms.

The reason for the interest is clear: companies want to minimize infrastructure management overhead by relying on the cloud platform to orchestrate cloud services rather than managing virtual servers themselves.

Public cloud platforms have already removed the burden of managing physical infrastructure.

Now serverless architecture stitches together various cloud services so that companies can simplify IT management while automatically benefiting from the frequent release of new cloud service capabilities.

The leading cloud platforms have noted this interest and new serverless options are coming out seemingly every day.

Of course, there are plenty of caveats, such as having to fit within the constraints of the service or set of services, a lack of visibility into what were traditionally points of monitoring concern, and a lack of familiarity on the part of many development teams.

But even so, adoption of serverless is quite discernible.

I encountered my first serverless application last summer when a company came to Logicworks looking for a managed services partner.

The challenges were immediately apparent.

First, there was no infrastructure to manage.

Seems obvious, but from the perspective of an infrastructure MSP, we're immediately looking for where we can add value.

Second, so much of the tooling and systems used to manage traditional infrastructure (or even fully virtualized infrastructure such as Amazon EC2) simply had no place in this world.

The challenges were similar to those we faced with the rise of containerized applications, but more so.

In a containerized system, we found ways to do intrusion detection, log aggregation and host-level monitoring because we still had access to the host OS.

In a serverless system, customers that must meet PCI, HIPAA or HITRUST standards will have to wait for serverless-ready solutions.

These are real challenges for an MSP, and solutions that help companies achieve specific compliance requirements will come.

The bigger question faced by our industry is: what do we do when there's nothing to manage but the application itself?

We had encountered similar challenges with the rise of Platform as a Service (PaaS), but there was always room to run the more complex workloads that didn't fit within those constraints.

Now that the code is running directly on the services and the number and capabilities of those services have grown so much, the line between infrastructure and code is blurring even further than the traditional understanding of infrastructure as code (IaC).

Of course, while new methods and tools are always arriving in IT, almost nothing ever goes away entirely.

We still have mainframes and we will have monolithic applications running on traditional topologies for some time.

There will be room for an MSP to add value for a very long time.

But I'm still prompted to think about our place in a serverless world.

As an industry, we want to do more than just manage what will ultimately become legacy workloads.

And frankly, keeping good talent requires that we keep working on the cutting edge in addition to older platforms.

Also, we need to work with risk-welcoming verticals now so that we are experts by the time the platforms mature and come into use by the slower-moving, more risk-averse businesses down the line.

The most obvious take is that we will move up the stack as we've already done, getting closer to and sometimes owning the code deployment process, integrating with cloud services and providing an overlay of governance and expertise.

We would need to take this one step further by participating in the application architecture, suggesting services and helping teams integrate them.

Given the rate of change, having an MSP focused on keeping abreast of the tooling would be an asset.

Another nuance that differentiates serverless from PaaS is that we can use serverless services like Legos to build a platform or PaaS solution ourselves.

This would be a welcome difference from PaaS in that we can assemble the services into solutions that match our clients' needs more completely, then continuously improve that deployment as new functionality is developed.

Examples of this from our team have included writing AWS Lambda functions on a client's behalf, incorporating AWS Simple Systems Manager into our existing automation framework, and gluing together both cloud-native services and third-party ISV offerings.
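To make the first of those examples concrete, here is a minimal sketch of the kind of governance Lambda function an MSP might write for a client: tagging newly launched EC2 instances as they appear in a CloudTrail `RunInstances` event. The tag names and the `msp-automation` value are illustrative assumptions, not details from the article, and the real deployment would call `boto3` to apply the tags.

```python
def extract_instance_ids(event):
    """Pull EC2 instance IDs out of a CloudTrail RunInstances event payload."""
    items = event["detail"]["responseElements"]["instancesSet"]["items"]
    return [item["instanceId"] for item in items]

def handler(event, context=None):
    # In a real deployment this would call ec2.create_tags(...) via boto3;
    # here we just compute and return the governance tags we would apply.
    ids = extract_instance_ids(event)
    return {"tagged": ids, "tag": {"Key": "ManagedBy", "Value": "msp-automation"}}

# Trimmed example of the event shape CloudWatch Events delivers for RunInstances.
event = {
    "detail": {
        "responseElements": {
            "instancesSet": {"items": [{"instanceId": "i-0abc"}, {"instanceId": "i-0def"}]}
        }
    }
}
print(handler(event)["tagged"])  # ['i-0abc', 'i-0def']
```

A function like this runs only when instances launch, so the client pays nothing while idle, which is exactly the economic appeal the article describes.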

We quickly came to appreciate the speed and interoperability these serverless services provided.

So, while we don't know the future of serverless, we're keeping our eye on this next stage of virtualization and are well aware that we need to stay in front of it.

We are increasingly encouraged by what we can do on our clients' behalf using the serverless ecosystem.

Frankly, any MSP worth its salt has, or should have, the same concerns.

Ken Ziegler is CEO of Logicworks.



‘UK cloud services are strong and ready to compete with the world’s best tech economies’, Cloudreach CEO says as … – Data Economy

Brexit fails to halt data centre investments as operators enter a new phase of business evolution beyond pure colocation offerings.

Investment firm Palatine Private Equity has invested in UK-based data centre provider The Bunker, in what is the PE company's fifth deal out of Palatine's third fund.

The Bunker was founded in 1994 and offers services around cloud, colocation and hosting. The operator has also, over the years, built up knowledge of fintech and the wider financial services sectors.

It has two data centres in England, one in Ash, Kent, and the other in Newbury. Each data centre has a power capacity of just below 2.5MW; however, the sites have the capacity to be expanded by several MW.

Palatine said in a statement that current levels of sales activity within the business show a rate of growth that is set to continue, with the company also benefiting from market tailwinds such as GDPR and ever-greater demand for cyber security services.

In parallel with the new investment, and in conjunction with long-term business succession planning, Peregrine Newton (CEO) and Andy Theodorou (COO), both co-founders of the business, have decided to step aside.

Consulting director Andy Hague has been named as the new Group CEO, whilst Phil Bindley (CTO) has been promoted to Managing Director, with the rest of the senior management team remaining in their current roles.

Palatine was advised by BDO, Gateley Plc and RSM with due diligence provided by CIL, Marsh and The Berkeley Partnership. Debt facilities were provided by European Capital and Santander.

Speaking to Data Economy, BDO's head of TMT M&A, Robin Brown, said: "When you talk to a lot of British or European [data centre providers] as well as small data centre operators, they are unsure at the moment about what they should be doing with their businesses."

"Should they be focusing on just filling up their sites and getting a return from a colocation base? Should they be moving into managed services? Should they be moving into cybersecurity services? Should they be buying or building more small data centres to have a regional play?"

"What this investment by Palatine Private Equity into The Bunker shows is that there is investor support for data centres to invest in more advanced managed services and cybersecurity capabilities. Their proposition offers customers a technical solution that is greater than just colocation."

"The Bunker is designed to be ultra-secure, and what they have done over the years is migrate the business from being just colocation services into managed hosting, disaster recovery, network services and dedicated cloud services. Part of the strategy of the business is to move further into cybersecurity services over time."

With The Bunker based in the UK, uncertainty around Brexit failed to hinder any negotiations, with Brown playing down questions around private equity investors' fears.

He said: "One of the interesting implications for some of the UK private equity funds is that the basis underpinning their investment in their funds is that their businesses are UK registered."

"Quite often when they make an investment in Europe it is still being done by a UK-registered business. They will set up, even if it is a shell company or a small acquisition, something that is still run from the UK. Whether, over the longer term, you see more and more private equity funds being less constrained by the requirements of a UK-registered business is a question that is open; it does not have an answer yet."

"The Brexit effect has not been brought up as a significant impact on why they would or would not invest in the UK."

Looking ahead, and with investors showing more interest in the data centre market beyond pure-play colocation, Brown also said the investment carried out by Palatine could spark a wave of private equity investments.

"The private equity [investors] I am talking to at the moment are specifically interested in investing in data centres that have made the migration towards managed services," he said.

"Small data centres that are continuing to just do colocation are less attractive. Data centres that are moving into fields like cybersecurity are worth more to an investor, and an investor understands how they can make further acquisitions to grow the business and not be constrained by the footprint of the data centre itself."

"If they have services that are provided to clients that are in the data centre, they can then add additional services that do not rely solely on the data centre footprint. One example would be GDPR consulting services."


Marijuana startup Lemonhaze leverages Bizspark to jump from AWS to Azure – OnMSFT (blog)

Here at OnMSFT, we write a lot about cloud computing and the various tangential functions it encompasses such as machine learning, artificial intelligence, and data storage. We also take the time to crunch the numbers each quarter from some of the top names in the industry to see how each company is doing with trying to obtain or maintain the highly coveted market share crown.

But what does it all mean?

For one marijuana startup, it means ditching cloud computing incumbent Amazon for quickly rising software giant Microsoft's cloud offerings.

I was recently told an anecdotal story of how Microsoft's lesser-publicized BizSpark program helped marijuana dispensary Lemonhaze shift its entire business off AWS and onto Azure.

While this migration tale is just one of a thousand stories of businesses switching back and forth between Amazon's and Microsoft's cloud computing solutions, Lemonhaze's experience highlights Microsoft's developing concentration on helping small to medium-sized businesses access the benefits of the cloud.

As Lemonhaze writes on their blog: "Why did we make the move? Simple. Our AWS charges were eating through our limited capital resources."

While Lemonhaze has nothing but positive things to say about AWS as a service, it came down to money as to why the company migrated to Azure. Lemonhaze recounts a particularly costly encounter with AWS that helped motivate the company to switch.

"There was this one time an offshore team member inadvertently launched an RDS instance on AWS that cost us three thousand bucks. That's three grand! We would have much preferred spending that money on legal reefers."

Previously, the words Microsoft, startup and money usually seemed to end in another AWS success story, but according to Lemonhaze, Microsoft has a not-so-secret weapon to help convert those terms into Azure success stories.

BizSpark.

"Lemonhaze got accepted into the BizSpark program. With open arms. Thanks to BizSpark, Lemonhaze is now fully hosted in the Azure cloud for FREE! For a start-up, this is huge. It frees up capital that we have now allocated to other growth initiatives. Not only was migrating over a cinch, MSFT supplied us with professional service support to facilitate the migration. Seamless."

Worried about a possible rejection by Microsoft based on its involvement with marijuana, Lemonhaze cautiously looked into the BizSpark program, not expecting much. "As a start-up in the legal marijuana space, we have gotten accustomed to hearing the word no and the myriad of rejections we deal with on a regular basis," recounts the team.

Fortunately, Lemonhaze qualified for BizSpark and managed to save a whopping 100% on its web hosting. Lemonhaze is realistic about its future with Azure, as BizSpark is mainly a testing and migration tool that offers credits to lure potential customers in. At some point, Lemonhaze will eventually have to shell out its own money to host its services on Azure, but based on its positive experience and seamless transition, the team plans on staying with Azure for the foreseeable future.

"We dig the BizSpark/Azure approach of assisting start-ups. Most start-ups fail. At Lemonhaze we fail every day. When working with AWS, the only free customer support available was for billing questions. There's nothing scarier to a start-up than having to part each month with a nice chunk of change, without the ability to assess if we are optimizing our spend. One wrong move on AWS can wreak havoc on the finances/future of a start-up. Note that when we did run into a critical technical issue, we were required to pay an additional fee for support with AWS."

Again, this is just one in a long list of tales that Microsoft will use to continue to pitch its Azure service and the company's newfound focus on tailoring its product for the little guys. We must also keep in mind that Amazon is in no danger of losing its cloud computing cachet or its multitude of customers anytime soon, but Lemonhaze's closing words should put a smile on the Azure team's face while worrying the AWS team a bit.

"Most small business owners I know are petrified of getting crushed by Amazon. Ditto with Amazon sellers. And brands. (P.S. This is the main reason why we are relieved to no longer be sharing any info/data with Amazon.)"

It may only be a few rumblings here and there, but once a narrative takes hold, it can be hard for a company to overcome its momentum. Fortunately, Microsoft knows a thing or two about trying to overcome narratives; it remains to be seen how Amazon handles its own.


Crypto-Startup Hubcoin Announces Distribution of Pre-Mined HUB to Altcoin Developers – Bitcoinist

Bitcoin PR Buzz August 2, 2017 11:45 am

Belarus-based blockchain company Hubcoin has announced that it has distributed 300,000 (10%) of the total pre-mined HUB tokens to a chosen group of different altcoin developers. The promotion took place alongside the live Hubcoin ICO.

[Note: This is a press release.]

Hubcoin believes that the launches of fair coins have been successful, but that there is still a lot of potential to get more out of their protocols. Influenced by successful premine-distribution projects such as NEM and Auroracoin, Hubcoin chose to give away 10% of the platform's HUB tokens to leading blockchain developers of previously established cryptocurrency projects. The company thinks this will be beneficial in the long run, as it should stimulate interest amongst the community.

Of the 10% of HUB tokens distributed, 7% was given to the developers of various altcoins that are already on the market. The distribution of HUB tokens was organized in proportion to each coin's total market capitalization. 2% of the coins were equally distributed between all altcoin developers currently holding a wallet, block explorer, or topic on the Hubcoin website, and 1% is held by the project's lead developer.

Hubcoin has implemented a technical restriction that limits premine spending for the first year. The use of the allocated Hubcoins by developers is designed to provide sustained motivation for all of the development teams and mitigate against early token dumps.

The project has outlined a number of different milestones from its roadmap that are listed in order of cost:

2 BTC: A coin-tracking service like Coinmarketcap or Coinwarz, capable of sorting coins by algorithm and listing new coins much faster.

4 BTC: Free hosting for interested altcoin developers (limited slots). The service will be extended to non-developers as well, for a small charge. Additional services like website creation, DDoS defense and more will also be provided. Block explorers will be created for coins needing them. Premium features and priority coin listing will be available for a fixed Hubcoin payment.

6 BTC: Create block explorers for altcoins that don't have one. Coin creators will be able to add their own coins to the service for free. Urgent coin additions will be supported for a small charge.

8 BTC: Pool creation for all coins requiring the feature. Coin creators will be allowed to add their coins to Hubcoin's aforementioned services at no cost (free slots will be limited). Premium features and priority coin listing will be available for a fixed HUB payment.

12 BTC: A node maintenance service will be created at this stage. The first three altcoin developers will get this service for free, while those who follow will be charged a small HUB token payment.

20 BTC: Desktop and web wallet service will be provided. It will be free for altcoin developers. Premium features will be available for a fixed HUB payment.

30 BTC: Android wallet creation service will be made available for free to altcoin developers. The platform will offer premium features for a fixed HUB payment.

50 BTC: Hubcoin will start offering its own coin creation service, which will be free of charge with a 10% premine share. Those opting for coins without premine, or with a custom premine, can avail themselves of the service by making a fixed HUB payment.

100 BTC: Promotional services for coins will be created, enabling facilities like voting to secure more recognition. Other promotional services such as press releases, social media, signature campaigns, advertisement, fakeouts search for investors, etc. will be provided.

200+ BTC: Support for reviving unsuccessful coins will be provided: repairing stuck blockchains, connection issues and hard forks, creating websites, adding block explorers and services, and full work on the coin. Hubcoin owners will be able to vote, by paying Hubcoin, to choose a coin for revival.

Developers who received tokens include: Saracenis (Topaz coin), Belligerent Fool (BenjiRolls), UsuallyHappens (PartyCoin), PhoenixWarrior333 (FidgetCoin), CryptoWiz420 (Turbostake), findblocks.com (findblocks.com), AtomicProject (Atomic Coin), mbmagnat (Evotion), bumbacoin (BumbaCoin), victoriouscoin (Victoriouscoin), Bzzzum (Deutsche eMark), TrollCoins (TrollCoins), Whitey92d15b7 (GPU Coin), ACP (AnarchistsPrime), OBAViJEST (Doubloon), soulgate (Virtacoinplus), joshafest (B3Coin), LiftOff1969 (UDOWNcoin), pallas (Cryptonite), vashshawn (corgicoinV2), DreamCrusherFTW (MOZZI), notnormals (InsaneCoin).

Images courtesy of Hubcoin, Shutterstock


Team sets new record for magnetic tape storagemakes tape competitive for cloud storage – Phys.Org

In this photo, IBM scientist Dr. Mark Lantz, holds a one square inch piece of Sony Storage Media Solutions sputtered tape, which can hold 201 Gigabytes, a new world record. Credit: IBM Research

Research scientists have achieved a new world record in tape storage, their fifth since 2006. The new record of 201 Gb/in2 (gigabits per square inch) in areal density was achieved on a prototype sputtered magnetic tape developed by Sony Storage Media Solutions. The scientists presented the achievement today at the 28th Magnetic Recording Conference (TMRC 2017) here.

Tape storage is currently the most secure, energy efficient and cost-effective solution for storing enormous amounts of back-up and archival data, as well as for new applications such as Big Data and cloud computing.

This new record areal recording density is more than 20 times that of current state-of-the-art commercial tape drives such as the IBM TS1155 enterprise tape drive, and it makes it possible to record up to about 330 terabytes (TB) of uncompressed data on a single tape cartridge that would fit in the palm of your hand.

That figure assumes the same format overheads as the TS1155 format and takes into account the 6.4% increase in tape length enabled by the thinner demo tape. For comparison, a TS1155 JD cartridge can hold 15 TB of uncompressed data in a 4.29 in. x 4.92 in. x 0.96 in. (109.0 mm x 125 mm x 24.5 mm) form factor.

330 terabytes of data is comparable to the text of 330 million books, which would fill a bookshelf stretching slightly beyond the distance from the northeastern to the southwestern-most tips of Japan.
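As a rough sanity check on the ~330 TB figure, the raw (pre-overhead) capacity follows directly from areal density times recorded tape area. The tape length and half-inch width below are illustrative assumptions, not figures from the article; real cartridges lose a substantial fraction of the raw figure to servo tracks, ECC and format overhead, which is why the quoted capacity is lower.

```python
# Ballpark capacity from areal density x tape area (assumed dimensions).
areal_density_gbit_per_in2 = 201   # gigabits per square inch (from the article)
tape_length_m = 1098               # assumption: roughly 1 km, typical for JD-class media
tape_width_in = 0.5                # half-inch tape

IN_PER_M = 39.3701
tape_area_in2 = tape_length_m * IN_PER_M * tape_width_in
raw_capacity_gbit = areal_density_gbit_per_in2 * tape_area_in2
raw_capacity_tb = raw_capacity_gbit / 8 / 1000   # gigabits -> gigabytes -> terabytes

print(f"{raw_capacity_tb:.0f} TB raw")  # 543 TB raw, before format/servo/ECC overhead
```

Under these assumptions the raw number comes out around 540 TB, comfortably above the 330 TB usable figure once realistic overheads are subtracted.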


Magnetic tape data storage is currently experiencing a renaissance. With this achievement, IBM scientists demonstrate the viability of continuing to scale the tape roadmap for another decade.

"Tape has traditionally been used for video archives, back-up files, replicas for disaster recovery and retention of information on premise, but the industry is also expanding to off-premise applications in the cloud," said IBM Fellow Evangelos Eleftheriou. "While sputtered tape is expected to cost a little more to manufacture than current commercial tape that uses Barium ferrite (BaFe), the potential for very high capacity will make the cost per TB very attractive, making this technology practical for cold storage in the cloud."

To achieve 201 billion bits per square inch, IBM researchers developed several new technologies, including:

IBM has been working closely with Sony Storage Media Solutions for several years, particularly on enabling increased areal recording densities. The results of this collaboration have led to various improvements in the media technology, such as advanced roll-to-roll technology for long sputtered tape fabrication and better lubricant technology, which stabilizes the functionality of the magnetic tape.

Many of the technologies developed and used in the areal density demonstrations are later incorporated into future tape products. Two notable examples from 2007 include an advanced noise predictive maximum likelihood read channel and first generation BaFe tape media.

IBM has a long history of innovation in magnetic tape data storage. Its first commercial tape product, the 726 Magnetic Tape Unit, was announced more than 60 years ago. It used reels of half-inch-wide tape that each had a capacity of about 2 megabytes. The areal density demonstration announced today represents a potential increase in capacity of 165,000,000 times compared with IBM's first tape drive product. This announcement reaffirms IBM's ongoing commitment and leadership in magnetic tape technology.


More information: Simeon Furrer et al. "201 Gb/in² Recording Areal Density on Sputtered Magnetic Tape," IEEE Transactions on Magnetics (2017). DOI: 10.1109/TMAG.2017.2727822

Journal reference: IEEE Transactions on Magnetics

Provided by: IBM


Attala Systems Announces Its High Performance Composable Storage Infrastructure Technology for Cloud and Analytics – PR Newswire (press release)

"As flash proliferates as the de facto storage medium for cloud infrastructure, adapting existing storage architectures to the cloud can be counterproductive," said Mike Heumann, Managing Partner, G2M Research. "Conventional AFA and SDS appliance architectures can bring significant performance inefficiencies and costs to cloud environments. Attala Systems' combination of an FPGA-based solution and self-learning orchestration and provisioning capabilities has the potential to remove many of the inefficiencies of conventional storage architectures."

Operators of cloud-based storage and real-time analytic systems are always seeking new ways to scale their infrastructure to support more clients, transactions or revenue-generating services, while providing superior performance, agility and reduced operational costs. A flexible data networking infrastructure, combined with automated provisioning and orchestration, significantly reduces TCO and operational costs. Attala Systems' end-to-end solution utilizes FPGA-based hardware to accomplish these goals and is purpose-built for cloud storage infrastructure. By being an end-to-end system, it also avoids the "DIY" performance and deployment risks associated with third-party/open-source software-based solutions. The result is an adaptable storage infrastructure that is essentially an elastic block storage (EBS) solution "on steroids".

"We are excited to be working with Attala Systems," said Dan McNamara, Corporate VP of the Programmable Solutions Group at Intel. "Intel FPGAs are enabling the next-level of capability for cloud and Enterprise infrastructure. By taking advantage of the flexibility of Intel FPGAs to accelerate networked NVMe performance and automation, Attala Systems is providing new and significant capabilities to cloud storage operators."

Attala Systems executives have a proven track record of creating and commercializing disruptive data center technologies. They include Sujith Arramreddy, Taufik Ma, and Sai Gadiraju. Sujith was a cofounder of both ServerWorks and ServerEngines, where he architected entire product lines and helped drive the acquisition of those companies by larger established companies. Taufik was one of the key executives responsible for the growth of Intel's CPU and chipset business, and helped drive Aarohi Communications and Emulex to success as a marketing and product management executive. Sai was a cofounder of ServerWorks and ServerEngines, where he was an engineering and operations executive responsible for developing and commercializing dozens of product lines over his career. Rounding out Attala Systems is an advisory board consisting of Jay Kidd (SVP and CTO roles at NetApp and Brocade) and Chris McBride (multiple customer-facing executive roles at Baffle, BlueArc, McData, Hitachi Data Systems, and other companies).

"Our focus from the ground up has been on designing and building a cloud-optimized storage infrastructure," said Taufik Ma, co-founder at Attala Systems. "In discussions with our customers, the need to build storage differently than it has been done in enterprises always comes across unambiguously. This is why companies such as Microsoft and Amazon opted for brand new approaches. Our architecture, developed with help from Intel's Programmable Solutions Group, is the realization of that need."

Please come by and visit Attala Systems in Booth 848 at the Flash Memory Summit, August 8-10, 2017, at the Santa Clara Convention Center in Santa Clara, CA.

About Attala Systems

Founded in 2015 with headquarters in San Jose, Calif., Attala Systems is an early-stage technology company focused on the design and development of a new generation of storage and networking infrastructure based on the use of FPGAs and cloud-focused self-learning orchestration and provisioning software. By freeing storage architectures from the multiple levels of abstraction inherent in enterprise-based storage systems, Attala significantly improves system performance and reduces operational costs for cloud providers and those with a need for high-performance, low-latency storage systems.

Press/Media/Analyst Contact: G2M Communications, media_relations@g2minc.com, Telephone: 858-610-9708

View original content:http://www.prnewswire.com/news-releases/attala-systems-announces-its-high-performance-composable-storage-infrastructure-technology-for-cloud-and-analytics-300498775.html

SOURCE Attala Systems


Mozilla bets its Rust language will make your internet safer – CNET

Mozilla's Rust programming language is designed to be more secure.

Twenty-two years ago, Mozilla co-founder Brendan Eich whipped up JavaScript in 10 days of manic activity. It's since become the world's most popular programming language.

Now Mozilla hopes lightning will strike twice with a sequel called Rust. If successful, the new programming language could be as important as JavaScript to Mozilla's mission of making the internet better for us all -- and it could be just as helpful to the nonprofit organization's reputation, too. JavaScript let web developers make websites interactive, triggering an explosion of innovation like online photo editing, scrolling and zooming maps, and word processing with Google Docs. Rust is a lower-level tool, though, that could make Mozilla's Firefox web browser faster and more secure.

And if its enthusiastic reception in programming circles continues, Rust could help protect a lot of other software from attacks that today are the bane of online existence. Attackers who have learned to exploit vulnerabilities in internet-linked software are responsible for stolen identities, drained bank accounts, leaked confidential documents and political persecution.

Rust, the "most loved" language in a 2017 survey by programming advice site Stack Overflow, has won allies like online storage service Dropbox. Programmers have contributed tens of thousands of packages of pre-written code to help others get their projects moving faster, too. Need to decode a web address, check the time, or handle some video? Somebody's probably already written the basics for you.

In a sign of growing interest, programmers are steadily increasing the number of pre-written Rust packages. That makes it faster to get started with a Rust software project.

"Rust is growing," says Redmonk analyst Stephen O'Grady, who tracks language popularity. "It made a substantial jump and is now solidly in second-tier language territory -- which is good company to be in."

It's really hard to create a major language like C, C++ or Java used by millions of programmers. Mozilla succeeded with JavaScript, with countless others helping to refine it over the years.

Looking good in programmers' eyes improves Mozilla's reputation as an innovator, not a laggard to be left behind. "The language is exciting," says Jonathan "Duke" Leto, the founder of programming firm Leto Labs. Restoring that reputation is important for Mozilla's attempt to ignite enthusiasm for its Firefox browser and use it for goals like protecting your privacy or making sure Google's websites don't require you to use Google's browser.

The biggest reason to like Rust is that it can wipe out a huge class of software security holes -- a major problem with browsers that today handle everything from our most personal communications to our biggest financial transactions. Even if you're not a programmer, you'll like a more secure internet.

"Every big piece of Rust code we get in there decreases the attack surface for security holes in the browser," says Dave Herman, Mozilla's director of strategy and research.

Rust's safety lets Mozilla free up Firefox memory, too, a key computing resource these days as we keep so many browser tabs open. Rust is also designed to better handle the thorny computing challenge of doing many tasks in parallel -- that's key to unlocking the power of modern chips that have many processing engines.
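A minimal sketch (not Mozilla or Servo code) can illustrate the guarantees described above: Rust's compiler enforces ownership rules at build time, so the dangling-pointer bugs and unsynchronized data races behind many browser security holes are rejected before the program ever runs.

```rust
use std::thread;

// Taking ownership of `v` means it is freed automatically when this
// function returns: no manual free, no double free, no leak.
fn consume(v: Vec<i32>) -> usize {
    v.len()
}

fn main() {
    let data = vec![1, 2, 3];
    let total: i32 = data.iter().sum(); // a shared borrow, checked statically
    assert_eq!(total, 6);

    // Ownership moves into `consume`; using `data` afterwards would be a
    // *compile-time* error, not a run-time use-after-free.
    assert_eq!(consume(data), 3);
    // println!("{:?}", data); // error[E0382]: borrow of moved value

    // Safe parallelism: each `move` closure takes ownership of its own
    // data, so unsynchronized shared mutation simply will not compile.
    let handles: Vec<_> = (0..4).map(|i| thread::spawn(move || i * i)).collect();
    let squares: Vec<i32> = handles.into_iter().map(|h| h.join().unwrap()).collect();
    assert_eq!(squares, vec![0, 1, 4, 9]);
}
```

The same freedom-from-aliasing rules that block use-after-free are what make the fearless-concurrency claim work: the compiler, not the programmer, proves that threads cannot race on the same memory.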

Even if nobody outside Mozilla ever touches Rust, it will directly help Firefox. Mozilla moves Rust-written components into Firefox through a project fittingly called Oxidation. Indeed, Rust and Oxidation are key to a project called Quantum to speed up Firefox with the release of Firefox 57 this November.


"You can try a lot of different experiments," says Sean White, Mozilla's senior vice president leading the emerging technologies. "We have way we can very quickly try and fail on things without touching the hundreds of millions of people using Firefox."

The source for these Rust components is new core browser software called Servo, a Mozilla research project that's written mostly in Rust.

Going whole hog and building a new browser entirely on Servo would be risky, though. Instead, Mozilla is cherry-picking the best parts and adding them to Firefox's core, called Gecko. "The future is intelligently managing the combination of the two," says David Bryant, Mozilla's vice president of platform engineering.

In the longer run, Mozilla wants Servo to be useful on its own. Today it struggles even with basic web documents like Wikipedia, so Mozilla's goal of getting it to work with the much more complicated Google Docs site is ambitious. Success there, though, would signal that Servo is approaching practical, standalone use.


And Mozilla is considering making a version of Servo that can be embedded into smaller computing devices, White says. One possible example: a VR headset that displays virtual reality worlds constructed with the WebVR technology Mozilla helped create. Servo is designed to take advantage of modern computer chips that can run lots of tasks in parallel, and success there could make Servo very efficient on inexpensive hardware.

Another programmer, Robert O'Callahan, is such a big Rust fan that he quit Mozilla to work on Rust programming tools at his startup, Pernosco. Most languages either give programmers low-level control or protection against memory-induced security problems, but not both. "Rust is the first mainstream language to escape that tradeoff," O'Callahan says.
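
O'Callahan's point about escaping that tradeoff comes down to Rust's borrow checker. The toy example below is ours, not his; it shows the kind of stale-reference bug that compiles silently in C or C++ but that Rust rejects at compile time.

```rust
fn main() {
    let mut names = vec![String::from("Firefox"), String::from("Servo")];

    // In C++, keeping a pointer to a vector element and then growing
    // the vector (which may reallocate its storage) leaves a dangling
    // pointer -- a classic use-after-free. Rust's borrow checker
    // refuses to compile the equivalent:
    //
    //     let first = &names[0];
    //     names.push(String::from("Gecko")); // error: cannot borrow
    //     println!("{}", first);             // `names` as mutable
    //
    // The safe version finishes with the data before mutating:
    let first = names[0].clone();
    names.push(String::from("Gecko"));
    println!("{} ({} engines)", first, names.len());
}
```

The program still gives the low-level control of choosing when to copy, but the whole category of dangling-reference bugs is gone before the code ever runs.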

Even if you don't care much about programmers toiling over their keyboards, you should care about that Rust advantage. With governments and identity thieves paying top dollar for computer attack software, everyone on the internet can be a potential victim.


Follow this link:
Mozilla bets its Rust language will make your internet safer - CNET


Kaspersky Internet Security 2018 18.0.0.405 – Tech Advisor

Kaspersky Internet Security 2018 is a powerful suite of malware-hunting, anti-hacker and web-safety tools. The package isn't officially out yet, but the download is fully functional and is effectively a release preview.

There's antivirus. Browsing protection. A firewall. Exploit protection. A vulnerability scanner, parental controls, webcam and audio protection, online transaction protection, and more.

These features have real value, too. Independent testing labs such as AV-Comparatives typically rate Kaspersky as offering some of the best protection around.

There's the Software Updater, which checks for updates to common applications (Adobe Reader, Flash, Java, Chrome, Firefox and more) and can optionally install them without you having to see or do anything at all.

A "Secure Connection" feature is essentially a privacy-oriented VPN (virtual private network), automatically kicking in to protect you when using wifi hotspots, web banking sites and more.

An Installation Assistance tool looks out for adware and other pests that get silently installed with some "free" software, and the Software Cleaner helps you decide what to remove.

What's new in Kaspersky Internet Security 2018? Your computer is now protected whilst it's booting or restarting, there's better protection against third-party manipulation of files on your hard drive, and additional notifications warn against potentially malicious websites. For any reseller, Kaspersky has also made it much easier for a user to add a new licence to an expired subscription, rather than forcing the user to renew.

Kaspersky Internet Security 2018 is an excellent security suite with some worthwhile improvements and enhancements. It's still one of the best packages around.

Originally posted here:
Kaspersky Internet Security 2018 18.0.0.405 - Tech Advisor


Cloud Computing and Collaborative MBSE Comes to HyperWorks – ENGINEERING.com

Custom cloud computing appliances can be accessed from most devices. (Image courtesy of PBS Works.)

Users can mix and match resources to configure their HPC appliances.

Engineers can configure and manage cloud resources using PBScloud.io. (Image courtesy of PBS Works.)

You can try PBScloud.io for free, or learn more about the solution on the PBS Works website.

Engineers can connect engineering analysis with system models by adding the ModelCenter tool from the HyperWorks simulation platform. Users can create and automate a multitool workflow within the framework of ModelCenter to integrate analysis performed using different platforms and using tools from multiple vendors. Users can select the optimal analysis or simulation tool from preferred vendors and interconnect the input and output data to form a single unified workflow.

Engineers can customize workflows with ModelCenter. (Image courtesy of Altair.)

Engineers can connect the engineering analysis to the systems model using the ModelCenter MBSEPak. With this model-based systems engineering (MBSE) approach, users can combine preprocessing, solving, post-processing, visualization and reporting tools from various sources to enable multidisciplinary analysis, validation and simulation.

Users can integrate their preferred tools from nearly any software application into the ModelCenter workflow. For example, user-generated tools, legacy FORTRAN programs, C++ applications, spreadsheets, mathematical models, databases, models from CAD tools, CAE models, and others can be combined into the workflow. Users can then optimize the workflow to use HPC resources, iterate part or all of the workflow with different data, and manage the results. Engineers can very quickly explore and optimize performance, cost, reliability and risk for a number of different design alternatives using this approach.

Here is the original post:
Cloud Computing and Collaborative MBSE Comes to HyperWorks - ENGINEERING.com


UK flip-flop on encryption doesn’t help anyone – CNET

In the debate over encrypting our private communications and giving the government backdoor access to better thwart terrorism, it's hard to tell where the British government stands.

"Encryption plays a fundamental role in protecting us all online."

"We need to make sure that our intelligence services have the ability to get into situations like encrypted WhatsApp."

"To be very clear Government supports strong encryption and has no intention of banning end-to-end encryption."

"There is a problem in terms of the growth of end-to-end encryption."

These statements might sound contradictory, but they have one thing in common: they can all be attributed to the UK's Home Secretary Amber Rudd.

Rudd has said all of these things and more about encryption in various speeches and interviews over the past few months, and in a self-penned article earlier this week. It's not just you: even read in context, the statements are all pretty confusing.

The comments are just the latest turn in the debate over encryption, which has become a bugbear of the British government in the wake of multiple terror attacks in the UK during 2017. While the protections guard our privacy, they also prevent the authorities from being able to read messages between terrorists. Prime Minister Theresa May has called multiple times on tech companies to "do more" to tackle the terror threat. Rudd, ahead of attending the Global Internet Forum to Counter Terrorism on Tuesday, wrote an editorial in the Telegraph saying that the UK isn't looking to ban encryption, but does want some kind of change.

The back and forth from Rudd is counterproductive because she's seemingly seeking a middle ground that doesn't exist. By parsing her statements, Rudd appears to suggest a version of encryption that is almost, but not absolutely, unbreakable. But end-to-end encryption means that not even the companies that create and enforce security measures can decrypt your messages, so the idea of an emergency access point seems far-fetched.

"Amber Rudd must be absolutely clear on what co-operation she expects from Internet companies," said Jim Killock, executive director of UK digital rights campaign Open Rights Group. "She is causing immense confusion because at the moment she sounds like she is asking for the impossible."

It's not like tech companies aren't willing to help. Facebook, Twitter and Google have shown willingness to work with governments on tackling terrorism.


But they aren't bending on the issue of putting in backdoors for government access. As tech companies and security experts have repeatedly pointed out, if the companies themselves have a way of accessing these communications, so potentially do more malicious people.

Breakable encryption could also, as numerous experts, including Facebook Chief Operating Officer Sheryl Sandberg, point out, chase terrorists onto other platforms that aren't as willing to cooperate with governments.

"If people move off those encrypted services to go to encrypted services in countries that won't share the metadata, the government actually has less information, not more," Sandberg said in an interview broadcast by the BBC last week.

In fact, it's already happening. On Wednesday, three men were found guilty in the UK of plotting a terrorist attack and had been using the encrypted app Telegram to communicate with one another. Telegram was called out by Europol chief Rob Wainwright earlier this year for "causing major problems," by not being cooperative with law enforcement.

Another allegation Rudd has leveled at end-to-end encryption is that "real people" don't care about it. People don't use WhatsApp because it is secure, she said in her Telegraph editorial, but because it is convenient, cheap and user-friendly. This is more than a huge generalization; it's an assertion for which she provides absolutely no supporting evidence.

Indeed, her comments have attracted criticism from privacy organization Big Brother Watch, which said they were "at best naïve, at worst dangerous."

"Suggesting that people don't really want security from their online services is frankly insulting, what of those in society who are in dangerous or vulnerable situations, let alone those of us who simply want to protect our communications from breach, hack or cybercrime," said Renate Samson, the organization's chief executive in a statement.

"Once again the Government are attempting to undermine the security of all in response to the actions of a few," he said. "We are all digital citizens, we all deserve security in the digital space."

Rudd maintains "there are options" for using end-to-end encryption and also making sure terrorists "have no place to hide" online. But what these options are seem to be a mystery to everyone but her. For the sake of the British public, many of whom do care that their communications are kept private and secure, she needs to explain how this will work.


Read the rest here:
UK flip-flop on encryption doesn't help anyone - CNET
