
BOXX Expands Product Line and Services by Acquiring Cirrascale – GlobeNewswire (press release)

May 08, 2017 08:00 ET | Source: BOXX Technologies

Austin, TX, May 08, 2017 (GLOBE NEWSWIRE) -- BOXX Technologies, the leading innovator of high-performance computer workstations, rendering systems, and servers, today announced the acquisition of Cirrascale Corporation, a premier developer of multi-GPU servers and cloud solutions designed for deep learning infrastructure. The acquisition enables BOXX to add Cirrascale's deep learning hardware to its line of multi-GPU solutions and solidifies BOXX as the leader in multi-GPU computer technology. Cirrascale Cloud Services, a BOXX subsidiary, will continue to expand its growing business and provide GPU-as-a-Service along with other professional services.

"Cirrascale is instantly recognizable as a leader in deep learning infrastructure and cloud services, and, like us, is a strategic partner of NVIDIA, so naturally we're excited to welcome them to the BOXX family," said Rick Krause, BOXX CEO. "We now have a complete solution of world-class deep learning servers, development workstations, and cloud services for data scientists, researchers, and other professionals."

Cirrascale Cloud Services offers a dedicated, bare-metal cloud service with the ability for customers to load their own instances of popular deep learning frameworks such as TensorFlow, Caffe, MXNet, and Theano. This provides user access to the raw horsepower of a modern multi-GPU system and is highly attractive to customers with various deep learning and HPC applications. BOXX will manufacture the high performance rackmount systems featuring up to eight NVIDIA Quadro or Tesla graphics cards.
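The release itself contains no code, but as a rough sketch of what a customer might run on one of these bare-metal, multi-GPU instances to confirm the cards are visible to a framework like TensorFlow, the snippet below assumes a TensorFlow 2.x installation; the model and all other details are hypothetical and not taken from BOXX or Cirrascale documentation.

```python
# Minimal sketch: confirm GPUs are visible and set up multi-GPU training.
# Assumes TensorFlow 2.x is installed on the bare-metal instance; nothing
# here comes from BOXX or Cirrascale documentation.
import tensorflow as tf

# List the physical GPUs the framework can see (up to eight on the
# rackmount systems described above).
gpus = tf.config.list_physical_devices("GPU")
print(f"Visible GPUs: {len(gpus)}")
for gpu in gpus:
    print(" ", gpu.name)

# MirroredStrategy replicates a model across all local GPUs so a single
# training job can use the whole machine.
strategy = tf.distribute.MirroredStrategy()
with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```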

"With expertise in the development and manufacturing of high-performance systems, BOXX will now deliver deep learning solutions to customers worldwide while providing services and support to meet their needs," said PJ Go, CEO, Cirrascale Cloud Services. "This enables our team to continue to expand our cloud services, which have grown exponentially over the past year. Together, our companies will further accelerate the ever-growing momentum of machine learning and artificial intelligence."

BOXX began primarily in media & entertainment, producing the fastest and highest quality hardware solutions for VFX, animation, and motion media applications. However, over the past decade, the hardware manufacturer has expanded to also develop products specific to architecture, engineering, and other markets that rely on professional 3D design applications. Multi-GPU workstations have always represented a significant portion of BOXX's business, but as the company continues to add enterprise customers like broadcast networks and organizations focused on deep learning, the acquisition of Cirrascale is a natural fit. The acquisition comes as BOXX, owned by Dallas-based private equity firm Craftsman Capital, is experiencing record growth.

"BOXX is an incredible brand built on delivering the highest possible performance to the customer," said Barrett Dean, Partner at Craftsman Capital. "With an expanded resource and knowledge base, we have complete confidence that consumers will view BOXX as a one-stop shop for deep learning infrastructure and cloud services. The highly experienced BOXX management team has done an outstanding job of building BOXX as a premium solution provider, so the addition of Cirrascale further expands the BOXX brand into new markets."

For further information and pricing on multi-GPU servers, contact BOXX at 1-877-877-2699. Learn more about multi-GPU servers, APEXX workstations, BOXX rendering solutions, BOXX finance options, and how to contact worldwide resellers, by visiting http://www.boxx.com. For further information on Cirrascale Cloud Services or to rent the latest x86 and POWER configurations, call (888) 942-3800 or visit http://www.cirrascale.cloud.

About BOXX Technologies

BOXX is the leading innovator of high-performance workstations and rendering systems for visual effects, animation, product design, engineering, architectural visualization, and more. Combining record-setting performance, speed, and reliability with unparalleled industry knowledge, BOXX is the trusted choice for creative professionals worldwide. For more information, visit http://www.boxx.com.

About Cirrascale Cloud Services

Cirrascale Cloud Services is a premier provider of dedicated public and private, multi-GPU cloud solutions enabling deep learning. The company offers the latest cloud-based solutions for large-scale deep learning operators, service providers, as well as HPC users. To learn more about Cirrascale Cloud Services and its unique dedicated, multi-GPU cloud solutions, please visit http://www.cirrascale.cloud or call (888) 942-3800.

Attachments:

A photo accompanying this announcement is available at http://www.globenewswire.com/NewsRoom/AttachmentNg/2d54d8d7-1b2b-46ae-91ac-e24fe9930074


View post:
BOXX Expands Product Line and Services by Acquiring Cirrascale - GlobeNewswire (press release)


Transforming the datacenter – ZDNet

The biggest fear early cloud adopters overcame was the fear of disruption. How would they move everything from their existing data center to a cloud service without disrupting business operations? The answer was, they didn't.

Smart companies realized that the safest way to transition from on-premises systems to the cloud would be to first assess all their applications, workloads, and other data assets to determine which would be most amenable to the cloud transition. They began with low-criticality workloads and slowly worked their way up the list. Some are still working through that process.

For those organizations, a new concern emerged. How would they manage an environment that basically consisted of two network cores: one still on-premises and the other now in a cloud facility? Would they need to maintain two separate environments? Two management consoles? Two sets of protocols? What would happen when they needed to combine resources from both environments to accomplish something?

The alignment of Active Directory for Windows Server with Active Directory for Azure meant that data center managers could create a unified, seamless system that combined both on-premises and cloud-resident systems under one AD forest. This enables IT to use one set of protocols, one set of user credentials, one set of security standards, and one unified data environment.

It also facilitated the smooth movement of more applications, workloads, and other data assets from the on-prem network to the cloud. Once the two are aligned, it becomes almost effortless to provision new virtual machines and new resource stacks, and move them across the network.

Gartner recently predicted that, by 2020, the "cloud shift" will affect more than $1 trillion in IT spending. That means that more organizations will be migrating more of their IT operations to the cloud. Some will still need to port, modify, or rewrite some applications before they can be migrated. The recent embrace of open systems by Microsoft clearly signals Redmond's commitment to assist in this challenge. One way in which the company is facilitating cloud migration is by developing more and more enhancements to Azure services that will accommodate a broader variety of platforms and environments.

Many resources combine to provide a complete Azure service for your applications. These include virtual machines (VMs), your storage account, web apps, databases, and your virtual network (VNet). Originally, when you managed an Azure cloud environment, you created, deployed, and managed each of these separately. When adjustments needed to be made, you went to the required consoles manually. When you wanted to replicate your solution in another Azure environment, you had to recreate everything.

The introduction of Azure Resource Manager (ARM) in 2014 changed all that, mainly by providing resource groups, which are defined by Microsoft as "a container that holds related resources for an Azure solution. The resource group can include all the resources for the solution, or only those resources that you want to manage as a group. You decide how you want to allocate resources to resource groups based on what makes the most sense for your organization." Now all the required resources can be managed as a group, which saves tremendous time and effort.
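The paragraph above describes resource groups only in prose. As a hedged sketch of what working with one looks like in code, the snippet below uses the Azure SDK for Python; the subscription ID, group name, and location are placeholders, and it assumes the azure-identity and azure-mgmt-resource packages plus an already-authenticated environment.

```python
# Sketch only: create a resource group and list everything it contains.
# Assumes `pip install azure-identity azure-mgmt-resource` and that the
# environment is already logged in (e.g. via `az login`).
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "00000000-0000-0000-0000-000000000000"  # placeholder
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

# One call creates (or updates) the container that will hold the VM,
# storage account, web app, database and virtual network as a unit.
client.resource_groups.create_or_update(
    "demo-solution-rg", {"location": "eastus"}
)

# The whole solution can now be inspected -- or deleted -- as a group.
for resource in client.resources.list_by_resource_group("demo-solution-rg"):
    print(resource.type, resource.name)
```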

As the cloud revolution marched on, it became clear that cloud customers enjoy better IT services at a lower cost. It made little sense to own and operate your own servers when a cloud provider could deliver a professionally operated data service, while you pay only for what you use.

Now, all of the tools required to ease the path from on-premises to cloud facilities are emerging very quickly. Companies can move from on-premises to robust co-existence and then finally complete the journey to the cloud on their own schedule with virtually no disruption whatsoever.

Read more:
Transforming the datacenter - ZDNet


The cloud computing tidal wave – BetaNews

The title above is a play on the famous Bill Gates memo, The Internet Tidal Wave, written in May 1995. Gates, on one of his reading weeks, realized that the Internet was the future of IT and that Microsoft, through Gates's own miscalculation, was then barely part of that future. So he wrote the memo, turned the company around, built Internet Explorer, and changed the course of business history.

That's how people tend to read the memo, as a snapshot of technical brilliance and ambition. But the inspiration for the Gates memo was another document, The Final Days of Autodesk, written in 1991 by Autodesk CEO John Walker. Walker's memo was not about how the future could be saved, but about how seemingly invincible market advantages could be quickly lost. If Autodesk, the Computer Aided Design pioneer, was ever going to die, this was how Walker figured it would happen. And Gates believed him. Now it's about to happen again. Amazon Web Services -- the first and still largest public computing cloud -- is 11 years old, which is old enough for there not only to be some clear cloud computing winners (AWS, Microsoft Azure and a bunch of startups) but some obvious losers, too. This rising tide is not raising all ships. That's why it's time for the Cloud Computing Tidal Wave.

In the world of computing, almost every platform transition creates a new market giant. Old companies generally die to make way for new companies. Univac and Burroughs were parts of the mainframe era that didn't survive, replaced by minicomputers from companies like Digital, Data General and Prime. Those companies in turn gave way to personal computing pioneers like Apple, Compaq, and Microsoft. Only IBM seemed to remain a constant from one hardware generation to the next. But now we're in the mobile era and IBM has almost no presence there, so the platform transition rule may still hold true.

The new thing is the cloud, and that wave will have its new champions, too, as well as losers. We've tended to focus our attention on providers of cloud hosting services, but the cloud is much more than data centers and servers. It's applications and services, too, and hardly any of those are coming from old guard companies.

First among the losers in cloud computing are the venerable mainframes that survive today mainly because Big Business still relies on a lot of old COBOL code -- code too big to be comfortable on a PC or even a minicomputer. But the cloud scales almost without limit, that COBOL is heading there, and the shift can only hurt mainframe computer makers.

Suffering, too, are the personal computer makers. As processing moves from the desktop to the cloud, desktops get punier, cheaper, and less profitable. There's money to be made in the initial transformation from desktop to cloud, but what happens when all those desktops have been replaced? For the most part they won't ever need to be upgraded. The three-year PC upgrade cycle for businesses is already being disrupted. I am writing this column on a mid-2010 Apple MacBook Pro -- a seven-year-old computer I have no plans to replace because it works just fine, thanks to the boost it gets from cloud services.

In every platform transition there are companies that probably can't make the jump. One that stands out today, especially because it has been in the news, is Citrix Systems, the Virtual Desktop Infrastructure (VDI) pioneer. VDI is, at first glance, a lot like the cloud. Citrix even refers to itself as a "cloud services company." But VDI isn't the cloud. VDI allows businesses to make one PC serve several users, or one server help dozens or hundreds. But in cloud computing even the PC is virtual, which is very different.

Old market leaders like Citrix are making too much profit on legacy VDI contracts to really switch to the cloud. The company can't bring itself to make its own products obsolete, so that's left to some other company -- in the case of Citrix, the likely vanquisher is a Silicon Valley startup called Frame, which has been moving companies like Adobe, Autodesk, HP, and Siemens to the cloud.

Citrix, which hired Goldman Sachs earlier this year to help it find a buyer, would probably love to sell itself to Microsoft, but how likely is that given Microsoft's absolute commitment to the cloud? Not very.

More:
The cloud computing tidal wave - BetaNews


World's First Quantum Computer Is Here – Wall Street Pit

China has achieved another remarkable feat. Aside from having the world's fastest supercomputer (TaihuLight), they have now developed a quantum computer that can supposedly work 24,000 times faster than any existing supercomputer, including their own.

While other tech companies like D-Wave and IBM have already managed to build their own quantum computers, what differentiates China's quantum computer is the use of multiple photons (the visible particles of light), specifically five of them, which is what gives its computing speed a super boost.

Prior to this, Pan Jianwei, the leader of the research team from the University of Science and Technology of China that built the multi-photon quantum computer, together with one of his colleagues, Lu Chaoyang, was credited with developing the world's best semiconductor quantum-dot-based single-photon source. Using this photon source and an electronically programmable photonic circuit made it possible for them to build their multi-photon quantum computing device.

According to Pan Jianwei, their machine can do calculations 10 to 100 times faster than ENIAC, the first electronic digital computer, built in the 1940s. And while there's no practical use for it yet, its ability to predict the highly complex behavior and movement of photons (something that traditional computers are incapable of doing) is a clear testament to the potential of quantum computers.

Traditional computers store and process information in bits that can represent either 0 or 1. On the other hand, quantum computers use qubits (short for quantum bits) that can represent 0, or 1, or 0 and 1 simultaneously through the quantum concepts of superposition (being able to exist in two states at once) and entanglement. This is what makes a quantum computer special: it can process data and calculate outcomes simultaneously. And the more qubits that can be manipulated, the faster its computing ability becomes.
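The qubit description above can be made concrete with a tiny state-vector toy model. The NumPy sketch below is an illustration written for this roundup, not anything from the Chinese team or from real quantum hardware; it shows a qubit being put into an equal superposition and why n qubits correspond to 2^n amplitudes.

```python
# Toy state-vector illustration of superposition, not a real quantum computer.
import numpy as np

# A single qubit starts in state |0>: amplitude 1 for "0", 0 for "1".
qubit = np.array([1.0, 0.0])

# The Hadamard gate puts it into an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
superposed = H @ qubit
print(superposed)               # [0.707..., 0.707...]
print(np.abs(superposed) ** 2)  # 50% chance of measuring 0, 50% of measuring 1

# n qubits require 2**n amplitudes -- this exponential growth is why adding
# qubits increases the number of states a quantum machine can work with at once.
for n in (5, 10, 20):
    print(n, "qubits ->", 2 ** n, "amplitudes")
```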

A common analogy used to explain the concept of quantum computing is reading books in a library. With traditional computing, it's like reading one book at a time, finishing one before moving on to another. With quantum computing, it's like reading all the books at the same time.

This theory has been around for a while, but researchers have yet to figure out the best approach that can transform this quantum computing dream into a reality.

This is what makes China's accomplishment so significant. Their latest quantum device, which they are calling a boson sampling machine, isn't just able to perform calculations for five photons; it's able to do so at a speed that's 24,000 times faster than what previous proof-of-concept experiments showed. Even better, the team says that scaling up their architecture to a higher number of photons is feasible, adding to its potential to compete with and eventually beat the performance of classical computers.

According to Shanghai Daily, the team is aiming to be able to manipulate 20 entangled photons by the end of the year. Their research was recently published in Nature Photonics.

Read the original:
World's First Quantum Computer Is Here - Wall Street Pit - Wall Street Pit


China adds a quantum computer to high-performance computing arsenal – PCWorld


China already has the world's fastest supercomputer and has now built a crude quantum computer that could outpace today's PCs and servers.

Quantum computers have already been built by companies like IBM and D-Wave, but Chinese researchers have taken a different approach. They are introducing quantum computing using multiple photons, which could provide a superior way to calculate compared to today's computers.

The Chinese quantum computing architecture allows for five-photon sampling and entanglement. It's an improvement over previous experiments involving single-photon sourcing -- up to 24,000 times faster, the researchers claimed.

The Chinese researchers have built components required for Boson sampling, which has been theorized for a long time and is considered an easy way to build a quantum computer. The architecture built by the Chinese can include a large number of photons, which increases the speed and scale of computing.

China is strengthening its technology arsenal in an effort to be self-sufficient. China's homegrown chip powers TaihuLight, the world's fastest computer.

In 2014, China said it would spend US$150 billion on semiconductor development so that PCs and mobile devices would convert to homegrown chips. Afraid that low-cost Chinese chips will flood the market, the U.S. earlier this year accused China of rigging the semiconductor market to its advantage.

It's not clear yet if a quantum computer is on China's national agenda. But China's rapid technological progress is worrying countries like the U.S. A superfast quantum computer could enhance the country's progress in areas like weapons development, in which high-performance computers are key.

But there's a long way to go before China builds its first full-fledged quantum computer. The prototype quantum computer is good for specific uses but is not designed to be a universal quantum computer that can run any task.

The research behind quantum computers is gaining steam as PCs and servers reach their limit. It's becoming difficult to shrink chips to smaller geometries, which could upset the cycle of reducing costs of computers while boosting speeds.

If they deliver on their promise, quantum computers will drive computing into the future. They are fundamentally different from computers used today.

Bits on today's computers are stored as ones or zeros, while quantum computers rely on qubits, also called quantum bits. Qubits can achieve various states, including holding a one and zero simultaneously, and those states can multiply.

The parallelism allows qubits to do more calculations simultaneously. However, qubits are considered fragile and highly unstable, and can easily break down during entanglement, a technical term for when qubits interact. A breakdown could bring instability to a computing process.

The Chinese quantum computer has a photon device based on quantum dots, demultiplexers, photonic circuits, and detectors.

There are multiple ways to build a quantum computer, including via superconducting qubits, which are the building block of D-Wave Systems' machines. Like the Chinese system, D-Wave's quantum annealing method is another easy way to build a quantum computer but is not considered ideal for a universal quantum computer.

IBM already has a 5-qubit quantum computer that is available via the cloud. It is now chasing a universal quantum computer using superconducting qubits, but with a different gating model to stabilize the system. Microsoft is pursuing a new kind of quantum computer based on a new topology and a yet-undiscovered particle called the non-abelian anyon.

In a bid to build computers of the future, China has also built a neuromorphic chip called Darwin.

Here is the original post:
China adds a quantum computer to high-performance computing arsenal - PCWorld


The Quantum Computer Revolution Is Closer Than You May Think – National Review

Let's make no mistake: The race for a quantum computer is the new arms race.

As Arthur Herman wrote in a recent NRO article, "Quantum Cryptography: A Boon for Security," the competition to create the first quantum computer is heating up. The country that develops one first will have the ability to cripple militaries and topple the global economy. To deter such activity, and to ensure our security, the United States must win this new race to the quantum-computer revolution.

Classical computers operate in bits, with each bit being either a 0 or 1. Quantum computers, by contrast, operate in quantum bits, or qubits, which can be both 0 and 1 simultaneously. Therefore, quantum computers can, in effect, work through an enormous number of calculations at once rather than sequentially. Because of these properties, a single quantum computer could be the master key to hijack our country.

The danger of a quantum computer is its ability to tear through the encryption protecting most of our online data, which means it could wipe out the global financial system or locate weapons of mass destruction. Quantum computers operate much differently from today's classical computers and could crack encryption in less time than it takes to snap one's fingers.

In 2016, 4.2 billion computerized records in the United States were compromised, a staggering 421 percent increase from the prior year. What's more, foreign countries are stealing encrypted U.S. data and storing it because they know that in roughly a decade, quantum computers will be able to get around the encryption.

Many experts agree that the U.S. still has the advantage in the nascent world of quantum computing, thanks to heavy investment by giants such as Microsoft, Intel, IBM, D-Wave, and Google. Yet with China graduating 4.7 million of its students per year with STEM degrees while the U.S. graduates a little over half a million, how long can the U.S. maintain its lead?

Maybe not for long. Half of the global landmark scientific achievements of 2014 were led by a European consortium and the other half by China, according to a 2015 MIT study. The European Union has made quantum research a flagship project over the next ten years and is committed to investing nearly $1 billion. While the U.S. government allocates about $200 million per year to quantum research, a recent congressional report noted that inconsistent funding has slowed progress.

According to Dr. Chad Rigetti, a former member of IBM's quantum-computing group and now the CEO of Rigetti Computing, computing superiority is fundamental to long-term economic superiority, safety, and security. "Our strategy," he continues, "has to be viewing quantum computing as a way to regain American superiority in high-performance computing."

Additionally, cyber-policy advisor Tim Polk stated publicly that our edge in quantum technologies is under siege. In fact, China leads in unhackable quantum-enabled satellites and owns the world's fastest supercomputers.

While quantum computers will lead to astounding breakthroughs in medicine, manufacturing, artificial intelligence, defense, and more, rogue states or actors could use quantum computers for fiercely destructive purposes. Recall the hack of Sony by North Korea, Russian spies hacking Yahoo accounts, and the exposure of 22 million federal Office of Personnel Management records by Chinese hackers.

How can the United States win this race? We must take a multi-pronged approach to guard against the dangers of quantum computers while reaping their benefits. The near-term priority is to implement quantum-cybersecurity solutions, which fully protect against quantum-computer attacks. Solutions can soon be built directly into devices, accessed via the cloud, integrated with online browsers, or implemented alongside existing fiber-optic infrastructure.

Second, the U.S. needs to consider increasing federal research and development and boost incentives for industry and academia to develop technologies that align private interests with national-security interests, since quantum technology will lead to advances in defense and forge deterrent capabilities.

Third, as private companies advance quicker than government agencies, Washington should engage regularly with industry. Not only will policies evolve in a timely manner, but government agencies could become valuable early adopters.

Fourth, translating breakthroughs in the lab to commercial development will require training quantum engineers. Dr. Robert Schoelkopf, director of the Yale Quantum Institute, launched Quantum Circuits, Inc., to bridge this gap and to perform the commercial development of a quantum computer.

The United States achieved the unthinkable when it put a man on the Moon. Creating the first quantum computer will be easier, but the consequences if we don't will be far greater.

Idalia Friedson is a research assistant at the Hudson Institute.

Read more:
The Quantum Computer Revolution Is Closer Than You May Think - National Review


China builds five qubit quantum computer sampling and will scale to 20 qubits by end of this year and could any beat … – Next Big Future

Chinese researchers have built a ten-qubit quantum computer. They will scale to 20 qubits by the end of this year and could beat the performance of any regular computer next year with a 30-qubit system.

A Chinese research team led by Pan Jianwei is exploring three technical routes to quantum computers: 1. systems based on single photons, 2. ultra-cold atoms and 3. superconducting circuits.

Experimental set-up for multiphoton boson sampling. The set-up includes four key parts: the single-photon device, demultiplexers, ultra-low-loss photonic circuit and detectors. The single-photon device is a single InAs/GaAs quantum dot coupled to a 2-μm-diameter micropillar cavity.

Pan Jianwei and his colleagues Lu Chaoyang and Zhu Xiaobo, of the University of Science and Technology of China, and Wang Haohua, of Zhejiang University, set two international records in quantum control: the maximal numbers of entangled photonic quantum bits and of entangled superconducting quantum bits.

Pan said that manipulation of multi-particle entanglement is the core of quantum computing technology and has been the focus of international competition in quantum computing research.

In the photonic system, his team was the first in the world to entangle 5, 6, 8 and 10 photons, and is at the forefront of global developments.

Last year, Pan and Lu Chaoyang developed the world's best single-photon source based on semiconductor quantum dots. Now, they are using the high-performance single-photon source and an electronically programmable photonic circuit to build a multi-photon quantum computing prototype to run the boson sampling task.

The Chinese photonic computer is 10 to 100 times faster than the first electronic computer, ENIAC, and the first transistor computer, TRADIC, in running the classical algorithm.

The quantum device in Hefei, called a boson sampling machine, can now carry out calculations for five photons, but at a speed 24,000 times faster than previous experiments.

ENIAC contained 17,468 vacuum tubes, 7200 crystal diodes, 1500 relays, 70,000 resistors, 10,000 capacitors and approximately 5,000,000 hand-soldered joints. It could perform 5000 simple addition or subtraction operations per second. ENIAC could perform 500 floating point operations per second.

The Chinese team led by Pan, Zhu Xiaobo and Wang Haohua has broken that record. They independently developed a superconducting quantum circuit containing 10 superconducting quantum bits and successfully entangled the 10 quantum bits through a global quantum operation.

Nature Photonics: High-efficiency multiphoton boson sampling

They will try to design and manipulate 20 superconducting quantum bits by the end of the year. They also plan to launch a quantum cloud computing platform by the end of this year.

"Our architecture is feasible to be scaled up to a larger number of photons and with a higher rate to race against increasingly advanced computers," they said in the research paper.

Professor Scott Aaronson, who is based at the University of Texas at Austin and proposed the idea of the boson sampling machine, questioned whether it was useful to compare the latest results with technology developed over 60 years ago, but he said the research had shown "exciting experimental progress".

"It's a step towards boson sampling with, say, 30 photons, or some number that's large enough that no one will have to squint or argue about whether a quantum advantage has been attained," he said.

Aaronson said one of the main purposes of making boson sampling machines was to prove that quantum devices could be shown to have an advantage in one area of complex calculations over existing types of computer.

Doing so would answer the quantum computing sceptics and help pave the way towards universal quantum computation, he said.
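The article never says why boson sampling is believed to be hard for classical machines. The standard explanation, added here rather than drawn from the Next Big Future piece, is that the output probabilities depend on matrix permanents, which classical computers can only evaluate in exponential time; the sketch below implements Ryser's formula to show that cost.

```python
# Ryser's formula for the matrix permanent -- the quantity whose classical
# hardness underlies boson sampling. Runs in O(2**n * n) time, so it becomes
# impractical quickly as the photon number n grows. Illustration only.
from itertools import combinations
import numpy as np

def permanent(A):
    n = A.shape[0]
    total = 0.0
    for size in range(1, n + 1):
        for cols in combinations(range(n), size):
            row_sums = A[:, cols].sum(axis=1)
            total += (-1) ** size * np.prod(row_sums)
    return (-1) ** n * total

# For an n-photon boson sampler, each output probability is proportional to
# |permanent of an n x n submatrix|**2 of the interferometer's unitary.
A = np.random.rand(5, 5)   # stand-in for a 5-photon submatrix
print(permanent(A))
```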

Abstract

Boson sampling is considered as a strong candidate to demonstrate quantum computational supremacy over classical computers. However, previous proof-of-principle experiments suffered from small photon number and low sampling rates owing to the inefficiencies of the single-photon sources and multiport optical interferometers. Here, we develop two central components for high-performance boson sampling: robust multiphoton interferometers with 99% transmission rate and actively demultiplexed single-photon sources based on a quantum dot–micropillar with simultaneously high efficiency, purity and indistinguishability. We implement and validate three-, four- and five-photon boson sampling, and achieve sampling rates of 4.96 kHz, 151 Hz and 4 Hz, respectively, which are over 24,000 times faster than previous experiments. Our architecture can be scaled up for a larger number of photons and with higher sampling rates to compete with classical computers, and might provide experimental evidence against the extended Church–Turing thesis.

18 pages of supplemental material

View post:
China builds five qubit quantum computer sampling and will scale to 20 qubits by end of this year and could any beat ... - Next Big Future


From the Desk of Jay Fallis: To internet vote, or not to internet vote – BarrieToday

On occasion, when I forget my padlock at home, I have left my gym bag unlocked in the change room. In these instances, I take comfort knowing that most people in my community are honest and would not take my belongings. As of yet, I have not been stolen from. However, continuing this practice might one day have consequences.

This scenario is analogous to implementing internet voting in municipal elections. By exposing our democratic process through online ballot casting and tabulation, we would be at risk of manipulation on a large scale. Our one protection against such criminal action rests on the fact that most of those in our community would not willingly compromise our democratic system.

Over the past few years, many municipalities in the Barrie area have adopted or considered adopting online voting methods. While Barrie itself continues to use electronic tabulators instead of internet voting, Innisfil, Springwater, Oro-Medonte, and Penetanguishene will offer online voting in 2018 municipal elections. This past Monday, Orillia almost followed suit to permit internet voting. However, the Orillia city council narrowly defeated the proposal.

In the days leading up to the vote in Orillia, I talked with Councilor Mason Ainsworth to get a sense for this debate. Ainsworth has been a staunch opponent of internet voting. We started by talking about whether turnout rates might be affected by online voting.

"There are a lot of studies out there in regards to the voter turnout of online voting and pretty much all of them say that it doesn't increase turnout. This makes sense because really you're getting the same people who are voting either way."

While there are particular instances where turnout has improved after the implementation of online voting, most studies suggest that online voting does not have the capacity to increase voter turnout. Although access to the polls is improved under these alternative methods, generally speaking, those who have not participated in municipal elections in the past will carry those tendencies over to a new voting system.

Additionally, there are significant drawbacks associated with online voting. The first that Ainsworth talked about was voter fraud.

"In regards to fraud, we're not sure, and staff have openly said in the [council] meeting that they wouldn't know if this happened: if somebody else is voting for somebody," he said.

Ainsworth went on to suggest that online voting methods could allow a member of a household to vote on behalf of another, and that such actions could be carried out with bad intentions. According to Ainsworth, our current system guards against this problem by obligating voters to mark their ballot in privacy.

Internet hacking was also a consideration for Ainsworth. After being asked about the security measures taken by the Orillia city council, he felt that the plan in place would not be adequate to prevent hacking.

"Do we have a whole internet security staff group at City Hall? We don't. We have a couple of folks in IT, but we don't have folks who are specifically there to make sure all our stuff is secure."

Although, to date, no Ontario municipalities experimenting with online voting have been hacked, the potential exists, especially without adequate security measures in place. Along with these concerns, Ainsworth suggested that there are also problems around the capacity of hackers.

"I was reading an article the other day; there was a [sixteen-year-old] student and he hacked Microsoft and Sony, which are two giant major corporations," he said.

This scenario is not a one-off. In our conversation, we discussed just a few of the major institutions and elections that have been hacked: the Pentagon, Bitcoin, the 2012 federal NDP leadership race, and the American Democratic Party. Internet hacking can happen to any institution, no matter the security system in place.

While there could be some benefits to online voting, such as improved access for voters and potential savings for municipalities, there are also risks of large-scale manipulation. Perhaps, as Ainsworth suggested, it is best to wait until adequate security technology for internet voting becomes available down the road. In the meantime, municipalities would be better off experimenting with other alternative voting methods to improve turnout and convenience.

See more here:
From the Desk of Jay Fallis: To internet vote, or not to internet vote - BarrieToday


Are Blockchains Key to the Future of Web Encryption? – CoinDesk

Encrypted websites now handle more than half the world's web traffic, but the way the keys for those connections are exchanged and verified hasn't changed much in 20 years.

The current system relies on a global network of certificate authorities (CAs) to verify the public key and the owner of each secure website. It has long been criticized for creating central points of failure. And those central points, the CAs, have actually failed in some cases.
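For a concrete picture of the CA-based model being described, the short Python standard-library sketch below opens a TLS connection and lets the platform's bundle of trusted CAs validate the server's certificate; example.com is only a placeholder host.

```python
# Sketch of the current CA model: the client trusts whatever certificate
# chains back to a CA in the platform's default trust store.
import socket
import ssl

hostname = "example.com"  # placeholder
context = ssl.create_default_context()  # loads the system's trusted CAs

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()
        # The connection only succeeds because some CA the OS or browser
        # vendor chose to trust vouched for this key -- the central point
        # of failure the article is describing.
        print("issuer: ", dict(x[0] for x in cert["issuer"]))
        print("subject:", dict(x[0] for x in cert["subject"]))
```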

Some think blockchains -- the technology that manages key exchange for the $25bn bitcoin network -- could be the basis for a secure alternative.

Like blockchains, CAs began as a way to facilitate connected commerce. Veteran developer Christopher Allen, who helped set up the first certificate authority, VeriSign, said he imagined a system with several CAs where users would pick which ones to trust.

As the system has scaled, however, it's become impractical for everyday users to actively manage their trust in different authorities. Most now rely on their browser's default settings instead. It's now the browser companies that effectively control trust, giving them huge clout within the certificate industry.

"We've got a new centrality, which is the big browser companies," said Allen.

While control over trust has centralized, the number of certificate authorities has grown. There are now hundreds of authorities in countries around the world, and a failure at any one of them undermines the whole system.

The worst incident to date was the collapse of the Dutch authority DigiNotar in 2011. Hacking DigiNotar allowed attackers to spy on around 300,000 Iranian Gmail accounts, and forced a temporary shut down of many of the Dutch government's online services.

Since then, there have been dozens of cases where CAs were caught issuing unverified certificates, using substandard security, or even trying to deceive browser companies. None of these had the same effects as DigiNotar, and the industry has raised security standards many times since 2011, but there are still those who think it's time to look for a long-term alternative to CAs.

One of those alternatives was outlined in a 2015 white paper, written at a workshop Allen hosted called "Rebooting Web of Trust". The paper set out goals for a decentralized public key infrastructure (dpki) to replace the current, centralized system.

It reads:

"The goal of dpki is to ensure that ... no single third-party can compromise the integrity and security of the system as as whole."

In place of the current system, where domain ownership is recorded in the DNS and keys are verified by CAs, Rebooting Web of Trust envisioned a secure namespace where domain registrations and the key for each domain would be recorded on a blockchain.
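The white paper sketches the idea at a high level. The deliberately simplified Python model below, written for this roundup and not taken from the dPKI design itself, uses an append-only log as a stand-in for a blockchain and maps each name to an owner and a public key.

```python
# Toy model of a blockchain-backed namespace: an append-only log of
# registrations, where a name maps to its owner and public key. This is a
# simplification for illustration, not the dPKI design from the white paper.
import hashlib
import json

class NameChain:
    def __init__(self):
        self.log = []            # append-only list of "blocks"
        self.names = {}          # name -> {"owner": ..., "pubkey": ...}

    def _append(self, record):
        prev = self.log[-1]["hash"] if self.log else "0" * 64
        body = json.dumps(record, sort_keys=True) + prev
        record["hash"] = hashlib.sha256(body.encode()).hexdigest()
        self.log.append(record)

    def register(self, name, owner, pubkey):
        if name in self.names:               # first come, first served
            return False
        self.names[name] = {"owner": owner, "pubkey": pubkey}
        self._append({"op": "register", "name": name,
                      "owner": owner, "pubkey": pubkey})
        return True

    def lookup(self, name):
        # Anyone replaying the log arrives at the same answer -- no CA needed.
        return self.names.get(name)

chain = NameChain()
chain.register("example.eth", "alice", "PUBKEY_A")  # hypothetical values
print(chain.lookup("example.eth"))
```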

The Ethereum Name System (ENS) is trying to create the same kind of secure namespace for the ethereum community. It gives us a first look at the challenges and opportunities of making these ideas work in practice.

Developer Alex Van de Sande said his team often uses the analogy of a sandwich to explain how ENS is designed. The 'bread' in the ENS sandwich are two simple contracts. One stipulates that if you own the domain, you're entitled to its subdomains. The other handles payments.

Like in a sandwich, the complicated part of ENS is in the middle. That's the contract that sets the rules for name registration. ENS wants to avoid the problem of domain squatting, which was common during the initial internet domain name boom.

They're also pursuing the 'principle of least surprise', the idea that people shouldn't be too surprised by who actually owns a name. It might seem like common sense that Bank of America should have first dibs on bankofamerica.eth. But Van de Sande said that designing a system to implement that principle is very challenging, maybe even impractical.

He added that ENS will take the first year after the relaunch as an opportunity to learn how to improve the registration rules. If the rules change, he said, name owners will have a choice to upgrade or surrender their names for a refund.

Van de Sande said he hopes ENS will be a model for a wider use of similar ideas, adding:

"ENS reflects the way we wish the internet would be. It doesn't mean that it's actually going to be that way."

Another way to decentralize the infrastructure behind secure online communication is to ensure that users can verify the actual information they receive, rather than trying to secure the server-client connection.

Engineer Jude Nelson, who collaborated on the 2015 "Rebooting Web of Trust" white paper, told CoinDesk this is the goal of his startup, New York-based Blockstack.

Blockstack's system, which is currently in an alpha release, allows users to record their unique name and key on the bitcoin blockchain, and then lookup another user in order to verify the information they receive.

"With Blockstack, we're trying to make it so that developers can build server-less, decentralized, applications where users own their own data," said Nelson. "There are no passwords and developers don't have to host either of them."

This could, one day, reduce the need for website encryption altogether.

Each of these projects reflects the same overarching goal: to reduce the role of third parties and give users more control.

Allen, who has convened the Rebooting Web of Trust group every six months since 2015, said he is working towards technologies that give users true sovereignty.

The many strings of letters and numbers that represent individuals online today are all registered with third parties. "You're not really buying it, you're renting it. You don't have true sovereignty," said Allen.

But Allen also sees many challenges ahead. One is usability. Systems that work for technically adept users may not scale to applications where most users will rely on defaults and won't be prepared to make choices about who to trust.

Allen said:

"We've learned in technology that giving users choice often doesn't work."

Meanwhile, the centralized system is also changing. Google is in the middle of rolling out its own solution to the pitfalls of the CA system: a plan called Certificate Transparency, which requires CAs to log all trusted certificates in public view.

Google said it can verify log-inclusion and the log's honesty with Merkle trees, and the system has already allowed researchers to catch some bad certificates.
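Certificate Transparency's inclusion proofs rest on Merkle trees. The generic Python sketch below shows how a log's root hash commits to every certificate it contains; it is an illustration only and does not follow the exact hashing rules of Google's logs or RFC 6962.

```python
# Generic Merkle-tree sketch: the root hash commits to every entry, so a log
# operator cannot quietly drop or alter a certificate without changing the
# root that monitors have already seen. (Real CT logs follow RFC 6962, which
# differs in detail, e.g. distinct leaf/node hash prefixes.)
import hashlib

def h(data):
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:               # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

certs = [b"cert-for-example.com", b"cert-for-mail.example.com",
         b"cert-for-shop.example.com"]   # stand-ins for DER-encoded certs
print(merkle_root(certs).hex())

# Appending (or tampering with) any certificate changes the root, which is
# what lets researchers catch bad certificates once logs are public.
print(merkle_root(certs + [b"rogue-cert"]).hex())
```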

Google's idea is to keep the third party but remove the trust. And this approach may prove to be a long-term competitor to blockchain-based projects that want to get rid of both.



Here is the original post:
Are Blockchains Key to the Future of Web Encryption? - CoinDesk


How to Enable AES Encryption on Your Router – Laptop Mag

Encryption is a hotly debated topic in Washington, but an essential part of web security everywhere else. For most of us, this starts with how we're connecting to the internet -- our router. Each router has multiple encryption settings, of which AES/WPA2 is the hands-down winner when it comes to keeping each of us secure. Dated technologies, like WEP, aren't much better than no protection at all, and sadly some older routers are set to that by default. Firmware updates should bring you into modern times, but you may still have to know what you're looking for.

Here's how it's done, at least on my router.

1. In the address bar, enter the home address for your router and press Enter. For mine, it's 192.168.1.1, but depending on your router it could be 192.168.0.1, or another variation. If it's neither of those, a quick Google search for "[router brand] home address" will get you where you need to be.

2. Log in, and press OK to proceed.

3. Click Wireless settings at the top of the page -- or something similar on your router.

4. Click Basic Security Settings -- or, just security settings or something similar.

5. Under Wi-Fi Security, select WPA2. WPA2 utilizes AES encryption, which is typically plenty for most households (a brief software sketch of AES encryption follows these steps).

6. Click Apply at the bottom.
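On the router, WPA2's AES (CCMP) runs in firmware, so there is nothing to code. As a rough software illustration of AES-based authenticated encryption, though, the sketch below uses the third-party Python cryptography package and AES-GCM rather than the CCM mode Wi-Fi actually uses; it is purely illustrative.

```python
# Illustration of AES authenticated encryption in software, using AES-GCM
# from the `cryptography` package (pip install cryptography). WPA2 actually
# uses AES in CCM mode inside the router firmware; this is only an analogy.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # like a strong Wi-Fi session key
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # must be unique per message

plaintext = b"traffic between laptop and router"
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# Only someone holding the key can decrypt -- and any tampering with the
# ciphertext makes decryption fail rather than return garbage.
print(aesgcm.decrypt(nonce, ciphertext, None))
```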

Original post:
How to Enable AES Encryption on Your Router - Laptop Mag
