Thursday, October 31, 2019

Data Dimensions introduces PANOPTIC, its agnostic, end-to-end medical bill processing platform

Data Dimensions on Thursday launched PANOPTIC, an end-to-end medical bill processing platform built for the auto casualty and workers’ compensation markets.

The PANOPTIC platform is completely agnostic, integrating with any bill review company, which makes it a versatile asset for insurers. With features such as provider credentialing, fraud and abuse detection, and bill screening, filtering and routing, PANOPTIC is a comprehensive cost containment tool. The platform can also identify and help prevent provider network leakage.

The PANOPTIC platform can receive claims documentation, medical bills and their attachments in any format (paper mail, fax, email, eBill, direct upload). An Intelligent Business Rules Engine then applies ICD-9/ICD-10 validation, medical bill filtering, provider screening and claimant eligibility checks before routing the bills to the appropriate network, payor or bill review platform.
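
The routing logic described above maps naturally onto a small rules engine. Below is a minimal, hypothetical Python sketch of that flow; the names (Bill, validate, route_bill) and the toy ICD-10 check are illustrative assumptions, not Data Dimensions' actual implementation.

```python
# Hypothetical bill-validation and routing sketch; not Data Dimensions' code.
from dataclasses import dataclass

VALID_ICD10_PREFIXES = ("S", "T", "M")  # toy subset, for illustration only

@dataclass
class Bill:
    claim_id: str
    icd10_codes: list[str]
    provider_screened: bool
    claimant_eligible: bool

def validate(bill: Bill) -> bool:
    """Simplified stand-ins for ICD-10 validation, provider screening
    and claimant eligibility checks."""
    codes_ok = all(code[:1] in VALID_ICD10_PREFIXES for code in bill.icd10_codes)
    return codes_ok and bill.provider_screened and bill.claimant_eligible

def route_bill(bill: Bill) -> str:
    """Send valid bills to the bill review queue, everything else to exceptions."""
    return "bill_review_queue" if validate(bill) else "exception_review"

print(route_bill(Bill("C-1001", ["S72.001A"], True, True)))  # bill_review_queue
```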

In addition, PANOPTIC helps insurers meet Insurance Diversity Initiative (IDI) goals and pay providers and non-providers electronically, and it delivers actionable data through analytics and reporting.

Artificial intelligence is applied throughout the platform for maximum accuracy and efficiency, and the platform's proprietary tracking and monitoring component gives visibility at every stage of the process.

Intel Xeon E-2200 processors deliver additional layer of hardware-based security and manageability

Intel on Thursday announced general availability of its Xeon E-2200 processors, taking another step forward in data center security. Popular for small and medium business deployments, Intel Xeon E processors are also driving enhanced security usages through the additional layer of hardware-based security and manageability made possible by Intel Software Guard Extensions (Intel SGX).

A key priority at Intel is enabling features that will help protect sensitive customer data – and Intel SGX does just that.

The new 8-core Intel Xeon E-2200 processors enable servers to operate at frequencies reaching up to 5.0 GHz (with Intel Turbo Boost Technology 2.0) and feature expanded capacity for hardware-enhanced security with double the Intel SGX Enclave Page Cache (EPC), now 256MB, and side-channel mitigations in hardware. 

Intel invests heavily in security, and the larger enclave sizes allow larger code and datasets to be encrypted within the SGX enclave, expanding the uses of Intel SGX and paving the way for additional data center security innovations, such as AI architectures including federated learning.

Federated Learning is a machine learning paradigm where many compute systems are “federated” together to analyze large and/or diverse datasets. However, current approaches to AI can require complex webs of trust, where the data or the algorithm could be exposed to an untrusted party. Trusted Execution Environments (TEEs) such as Intel SGX provide a means for processing the data within protected enclaves.
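
As a concrete illustration of the paradigm, the sketch below implements plain federated averaging in Python/NumPy. It is a minimal toy, not Intel or Microsoft code: each party computes a local update on its private data and only model parameters are shared; in an SGX-based design, the aggregation step would run inside a protected enclave.

```python
# Minimal federated-averaging sketch (illustrative only).
import numpy as np

def local_update(weights, X, y, lr=0.1):
    """One gradient step of linear regression on a party's private data."""
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_average(updates):
    """Aggregate the parties' updates; conceptually the step a TEE protects."""
    return np.mean(updates, axis=0)

rng = np.random.default_rng(0)
# Two parties with private datasets (synthetic stand-ins for real data).
parties = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(2)]

weights = np.zeros(3)
for _ in range(100):
    weights = federated_average([local_update(weights, X, y) for X, y in parties])
print(weights)  # shared model; no party ever saw another's raw data
```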

This facilitates the advantages of cross-industry machine learning while still helping to maintain the privacy of individual data and the confidentiality of proprietary algorithms. Rival banks could build joint anti-money laundering models. Hospitals could apply remote, third-party analytics to patient data. Retailers could monetize their purchase data while keeping a focus on user privacy.

Federated Learning is an example of new security innovations that Intel SGX can enable. Intel’s ecosystem partners bring new ideas to the table constantly with customer data protection as a top priority. Microsoft has been at the forefront of confidential computing in the cloud with Azure. 

“The new Intel Xeon E-2200 processor unlocks additional enclave space which opens up new scenarios and improves performance. Microsoft plans to roll-out Xeon E-2200 based confidential computing clusters in UK South and Canada within the first quarter of 2020,” said Scott Woodgate, Azure Security, Microsoft.

Confidential Computing is an emerging industry initiative focused on securing data in-use, especially in multi-tenant cloud environments where the goal is to keep sensitive data isolated from all other privileged portions of the system stack. 

Intel SGX plays a large role in making this capability a reality, both at Intel and throughout the industry. As computing moves to span multiple environments, from on-prem to public cloud to edge, it is no wonder companies are looking for protection controls that help safeguard sensitive IP and workload data wherever that data resides.

Intel is also investing in the ecosystem, joining the Confidential Computing Consortium and contributing the Intel SGX Software Development Kit, to support a broad industry push to address the latest frontier for data confidentiality in the cloud.

Nasuni announces latest release of its file services platform for modernizing NAS infrastructure

Nasuni Corp. announced Wednesday the latest release of its file services platform for modernizing network attached storage (NAS) infrastructure. Nasuni’s new version gives enterprises access to artificial intelligence (AI) and search analytics for unstructured data, as well as powerful cloud migration capabilities. Companies gain a springboard for agile, intelligent migrations and cloud-first approaches, as well as multi-cloud flexibility.

Unstructured data has exploded, while globalization and agility pressures have made traditional NAS infrastructures inadequate. Businesses should no longer build file server silos, which can be costly, complex and unable to keep up with prevailing demands. 

Powered by its global file system, Nasuni delivers a file services platform built for the cloud that combines the performance of local file servers with the scalability and durability of cloud storage, at roughly half the cost of traditional file infrastructures.

The Nasuni Analytics Connector allows customers to turn unstructured data into big data. A consolidated cloud-based file system enables customers to export a temporary second copy of their file data to use with analytics software, AI, machine learning and other data recognition tools such as AWS Rekognition and Macie. This new release also features support for leading search software, including SharePoint Search, Acronis FilesConnect, Cloudtenna, Search Blox, Graymeta, and NeoFinder.
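
To make the workflow concrete, here is a brief sketch of the kind of analysis the exported copy enables: once the temporary second copy of file data is in Amazon S3, a standard tool such as Rekognition can be pointed at it directly. The bucket and key names are hypothetical, and this is not Nasuni's code.

```python
# Sketch: run Amazon Rekognition label detection on an exported S3 object.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

def label_image(bucket: str, key: str, max_labels: int = 10):
    """Detect labels in one image stored in S3 and return (name, confidence)."""
    response = rekognition.detect_labels(
        Image={"S3Object": {"Bucket": bucket, "Name": key}},
        MaxLabels=max_labels,
    )
    return [(label["Name"], round(label["Confidence"], 1))
            for label in response["Labels"]]

# Hypothetical export bucket produced by the Analytics Connector.
print(label_image("nasuni-analytics-export", "images/site-photo-001.jpg"))
```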

Customers can start their journey from traditional on-premises NAS to the cloud with the Nasuni AWS Cloud Migration Services and Nasuni Azure Cloud Migration Services. Nasuni’s services for AWS now include both Amazon Snowball and the Nasuni Cloud Migrator for AWS. Nasuni’s services for Azure now include Microsoft’s Data Box and Nasuni’s Cloud Migrator for Azure. These new services enable customers to seamlessly move data to Amazon S3 or Microsoft Azure storage faster and with less effort than a self-driven cloud migration, enabling a successful cloud-first strategy.

In addition to support for AWS and Azure, new support for Google Cloud Storage extends Nasuni’s public cloud storage capabilities to all three major public cloud providers, enabling customers to implement multi-cloud strategies based on their specific business or application requirements. This new release now supports more private cloud storage solutions, including NetApp StorageGRID, Nutanix Objects, IBM COS, Hitachi Vantara HCP, and Scality RING.

Users can migrate NAS silos to the cloud storage of their choice for on-demand capacity expansion, built-in backup, instant disaster recovery, and multi-site file sharing. They also gain sophisticated data insights and accelerate their cloud storage strategies with Nasuni’s ability to scale and foster collaboration across distributed workforces.

Nasuni enables enterprises to seamlessly navigate their entire journey from NAS consolidation to workforce productivity. The Nasuni platform is trusted by many of the largest organizations in retail, consumer goods, creative services, oil and gas, architecture, engineering, construction and manufacturing. The company is experiencing record growth, with a 250 percent increase in data under management over the last 24 months. 

Nasuni is now deployed in more than 7,000 locations in 70 countries. Nasuni’s growth reflects its customers’ needs to simplify how critical file data is stored and protected, while empowering users to share and collaborate on files across multiple sites and geographies with maximum performance and reliability.

Nasuni’s new enhancements will be available by the end of the year. The Nasuni platform comes in three editions -- Essentials, Advanced and Premium.

phoenixNAP boosts capabilities of its data security cloud offering to allow for greater storage and performance efficiencies

phoenixNAP announced new delivery models and architecture enhancements of its Data Security Cloud (DSC) solution. Leveraging VMware and Veeam technologies, phoenixNAP has enabled greater agility in its secure cloud environment while further improving its performance, backup, and management capabilities.

Data Security Cloud is phoenixNAP’s secure-hosted multi-tenant platform that uses strict virtualization and segmentation controls on powerful hardware to provide a robust platform for sensitive workloads.

The enhancements in the newest version include a new delivery model that lets organizations choose between the DSC Essentials and DSC Advanced platforms. In addition, the solution includes a more flexible backup storage tiering option and improved threat management controls.

Data Security Cloud is built for sensitive workloads, allowing organizations that operate under strict security and compliance standards to meet their goals. Based on VMware NSX, it provides advanced security through microsegmentation and workload mobility. Data Security Cloud users also benefit from vSAN, which ensures lower latency and greater cost-efficiency.

Leveraging Veeam backup technology, the improved Data Security Cloud includes cloud backup capacity equal to 50 or 100 percent of total storage volume, depending on the chosen plan. Both DSC Essentials and Advanced provide threat management capabilities such as a 24/7 SOC, endpoint protection, threat-risk intelligence, anti-virus and vulnerability scanning, as well as vulnerability remediation and intelligence, behavioral analytics, and security posture reporting.
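
As a worked example of the backup allotment rule, the snippet below assumes the 50 percent figure corresponds to DSC Essentials and 100 percent to DSC Advanced; the announcement does not state that mapping, so treat it as an illustrative assumption.

```python
# Backup allotment under the stated 50/100 percent rule (plan mapping assumed).
def backup_allotment_gb(storage_gb: float, plan: str) -> float:
    ratio = {"essentials": 0.5, "advanced": 1.0}[plan.lower()]
    return storage_gb * ratio

print(backup_allotment_gb(2000, "Essentials"))  # 1000.0 GB of cloud backup
print(backup_allotment_gb(2000, "Advanced"))    # 2000.0 GB of cloud backup
```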

The enhanced Data Security Cloud also includes all-flash vSAN by VMware, which provides greater flexibility in terms of performance storage scaling. VMware vSAN is a Software-Defined Data Center (SDDC) solution and a pioneer of hyper-converged infrastructure (HCI). Integrated directly into DSC, it helps deliver intensive applications with lower latency and greater cost-effectiveness. 

In addition to this, it provides an efficient performance storage option that is integrated into existing virtualized infrastructures, eliminating the need to purchase a traditional storage system and allowing for greater agility. As such, the solution is particularly convenient for demanding workloads such as gaming applications, high-transaction databases, parallel distributed file systems, and mission-critical applications. 

“The release of the enhanced Data Security Cloud is a new milestone in our journey towards the secure cloud,” said Ian McClarty, President at phoenixNAP. “The platform just got stronger and more robust, providing even greater protection for our clients’ critical workloads. We are excited to present DSC Essentials and DSC Advanced to the world and provide organizations with a powerful security solution. Data Security Cloud is built to withstand any disaster scenario or security breach while the new capabilities make it easier to deploy and manage.” 

“The enhancements to Data Security Cloud are designed to provide organizations with more flexibility and performance,” said Jim Aluotto, senior director, Cloud Provider Business, Americas Region, VMware. “VMware Cloud Providers such as phoenixNAP are empowering organizations of different sizes with a simple and flexible path to the cloud that supports their growth, performance and security goals.”

“The all-flash vSAN technology significantly expands the capabilities of our Data Security Cloud,” said William Bell, executive vice president of Products at phoenixNAP. “Using SSD flash disks for both the cache tier and the capacity tier, it enables easier management of storage resources while ensuring advanced performance. With that capability inside of Data Security Cloud, our customers can maximize the use of their infrastructure. Coupled with a more flexible storage tiering option and improved resource management controls, all-flash vSAN provides a more powerful cloud platform for sensitive and regulated workloads.”

Tripp Lite expands offerings available on BIMObject platform; helps architects, engineers design efficient data centers

Tripp Lite, manufacturer of power protection and connectivity solutions, announced on Wednesday that it has expanded its IT infrastructure solutions available on the BIMObject platform, providing additional design tools for data center architects.

The BIMObject platform is a growing digital content management system for building information modeling (BIM) objects. The cloud-based platform provides architects, builders, engineers and interior designers with robust and accurate 3D representations of components used in building design. 

It also integrates with computer-aided design and machine execution systems and helps designers facilitate collaboration and decision-making, raise efficiencies and cut costs.

The breadth of Tripp Lite solutions available on the platform now includes wall-mount and floor-standing rack enclosures, UPS systems, PDUs, power strips and KVM switches.

Tripp Lite BIM files work with compatible three-dimensional design software and file formats such as 3ds, ArchiCAD, AutoCAD, IFC, PDF, Revit and SketchUp.

“Including products in the Tripp Lite BIM catalog helps data center designers conceptualize and render realistic, space-efficient and cost-effective plans,” said Tony Locker, Tripp Lite’s vice president of product management for enterprise solutions. “By developing BIM files of products, data center designers are able to incorporate photo-real drawings of IT infrastructure solutions with confidence that the items’ geometry and appearance are represented accurately within the plans.”

Cray debuts ClusterStor E1000 storage to power data-driven workloads including AI, analytics, simulation and modeling

Cray, a Hewlett Packard Enterprise company, unveiled on Wednesday its Cray ClusterStor E1000 system, a new parallel storage platform for the exascale era. ClusterStor E1000 addresses the explosive growth of data from converged workloads and the need to access that data at improved speed by offering an optimal balance of storage performance, efficiency and scalability, effectively eliminating job pipeline congestion caused by I/O bottlenecks.

Cray ClusterStor E1000 systems will be available starting in the first quarter of next year. 

The next-generation global file storage system has already been selected by the US Department of Energy (DOE) for use at the Argonne Leadership Computing Facility, Oak Ridge National Laboratory and Lawrence Livermore National Laboratory, where the first three US exascale supercomputers will be housed (respectively Aurora, Frontier and El Capitan).

With the introduction of the ClusterStor E1000 storage system, Cray has completed the re-architecture of its end-to-end infrastructure portfolio, which encompasses Cray Shasta supercomputers, Cray Slingshot interconnect, and the Cray software platform. With Cray’s next-generation end-to-end supercomputing architecture, available for any datacenter environment, customers around the world can unleash the full potential of their data.

Recognizing the data access challenges presented by the exascale era, Cray’s ClusterStor E1000 enables organizations to achieve their research missions and business objectives faster. ClusterStor E1000 systems can deliver up to 1.6 terabytes per second and up to 50 million I/O operations per second per rack – more than double that of other parallel storage systems on the market today.

The new system also delivers purpose-engineered, end-to-end PCIe 4.0 storage controllers that serve the maximum performance of the underlying storage media to the compute nodes, along with new intelligent Cray software, ClusterStor Data Services, that lets customers align data flow with their specific workflow, placing application data on the right storage media (SSD pool or HDD pool) in the file system at the right time.

An entry-level system starts at 30 gigabytes per second and less than 60 terabytes of usable capacity. Customers can start at the size dictated by their current needs and scale as those needs grow, with maximum architectural headroom for future growth.
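
The data-placement idea behind ClusterStor Data Services can be pictured as a small policy function. The sketch below is purely illustrative, with invented thresholds; it is not Cray's software, just the shape of an SSD-versus-HDD pool decision.

```python
# Toy tiering policy: hot, small or random I/O lands on the SSD pool,
# large sequential or cold data on the HDD pool. Thresholds are invented.
def choose_pool(file_size_mb: float, io_pattern: str, hot: bool) -> str:
    if hot and (io_pattern == "random" or file_size_mb < 64):
        return "ssd_pool"
    return "hdd_pool"

print(choose_pool(4, "random", hot=True))          # ssd_pool
print(choose_pool(4096, "sequential", hot=False))  # hdd_pool
```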

The ClusterStor E1000 storage system can connect to any HPC compute system that supports high-speed networks such as 200 Gbps Cray Slingshot, InfiniBand EDR/HDR and 100/200 Gbps Ethernet.

OnApp releases Cloud Platform-as-a-Service that slashes the time, cost and complexity of running your own cloud platform

OnApp on Wednesday launched Cloud Platform-as-a-Service (CPaaS), a new way for service providers and enterprises to get a ready-to-use cloud platform.

OnApp CPaaS gives companies a complete cloud platform for running their own public, private or hybrid cloud services, without the cost and complexity of having to build, manage and support the underlying hardware and software infrastructure. 

The whole cloud platform is provided as a managed service. CPaaS is built on the OnApp Cloud platform, which automates all of the virtualization, software-defined storage and networking, provisioning, RBAC, template, pricing, backup and management functions needed to deliver public, private or hybrid cloud services. Everything is controlled through an intuitive, rebrandable control panel.
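
To give a feel for what "everything through the control panel" means in practice, here is a purely hypothetical sketch of provisioning a VM through a control panel's REST API; the endpoint, fields and credentials are invented for illustration and are not OnApp's documented API.

```python
# Hypothetical control-panel provisioning call (endpoint and fields invented).
import requests

BASE = "https://panel.example-cloud.net/api"  # hypothetical control panel URL
AUTH = ("api_user", "api_key")                # hypothetical credentials

def provision_vm(label: str, cpus: int, ram_mb: int, disk_gb: int) -> dict:
    """Ask the panel to build a VM; the platform handles storage, networking
    and placement behind this single request."""
    payload = {"virtual_machine": {"label": label, "cpus": cpus,
                                   "memory": ram_mb, "primary_disk_size": disk_gb}}
    resp = requests.post(f"{BASE}/virtual_machines.json",
                         json=payload, auth=AUTH, timeout=30)
    resp.raise_for_status()
    return resp.json()
```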

OnApp CPaaS offers automated deployment and configuration of the OnApp cloud management platform using hyperscale cloud infrastructure at AWS, datacenters from OVH, or the global network of OnApp clouds. 

The OnApp cloud platform combines virtualization, software-defined storage and networking, cloud orchestration, cloud management, metering and billing in one white label self-service portal. It also enables easy migration of workloads between different types of cloud infrastructure; with CPaaS, customers can take advantage of hyperscale performance without locking themselves into a single ecosystem.  

Three different CPaaS packages are available at launch. Each includes a full managed service package and an OnApp cloud control panel deployed at a customer's choice of AWS location. 

Customers then choose compute infrastructure according to their use case and price/performance needs: CPaaS Light provides compute infrastructure from the cloud marketplace built into the OnApp cloud platform, with more than 40 locations available; CPaaS Pro offers dedicated compute infrastructure hosted at OVH, with a range of server sizes and locations available; and CPaaS Hyper provides compute infrastructure using bare metal servers at AWS, for the most demanding workloads.

Each package is available now from OnApp. In the future, customers will be able to order CPaaS Light and Hyper through a self-service cloud builder at cloud.net, for delivery in as little as two hours. OnApp plans to expand its range of CPaaS services, providing similarly rapid delivery of other ready-to-use, ready-to-sell services.

Datameer releases Neebo cloud-native virtual analytics hub to discover, share, and collaborate on analytics and data science assets

Datameer on Wednesday introduced Neebo, a product that enables analytics and data science teams to find, combine, and publish trusted information assets across hybrid landscapes.

Neebo's self-service platform enables analytics professionals and data scientists to initiate projects in minutes and promptly answer analytics questions or build new models, thereby enabling greater business agility. Neebo provides a unified access point for analysts, data scientists, and business stakeholders to more effectively leverage all their data science and analytics assets across the enterprise. 

Neebo works with information assets of any type such as data, documents, reports, code, dashboards, SaaS applications, and data science models, no matter where they reside: on-premises, in the cloud, in SaaS applications, or in web services.

Neebo uses virtualization and AI techniques to support a number of key capabilities. With Neebo, teams can connect to analytics and data science assets and use them no matter where they reside; find and explore these assets to help answer analytics questions; combine assets to create new ones that are customized to solve the problem at hand; publish assets that can be consumed by business intelligence and data science tools, and share and collaborate to re-use assets and build trust and knowledge across the enterprise.
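
The sketch below illustrates that connect/find/combine workflow with a toy "virtual hub" in Python. The class and method names are invented for illustration; they are not Datameer's API. The key property it demonstrates is that assets are registered and combined by reference, so the underlying data never moves.

```python
# Toy virtual analytics hub: assets stay where they live; the hub only
# records locations and derivations. Not Datameer's actual API.
class VirtualAnalyticsHub:
    def __init__(self):
        self.assets = {}  # asset name -> source location or derivation

    def connect(self, name: str, location: str):
        """Register an asset in place, without copying it."""
        self.assets[name] = location

    def find(self, keyword: str):
        """Very small stand-in for AI-assisted asset discovery."""
        return [name for name in self.assets if keyword in name]

    def combine(self, new_name: str, *sources: str) -> str:
        """Define a new virtual asset in terms of existing ones."""
        self.assets[new_name] = {"derived_from": list(sources)}
        return new_name

hub = VirtualAnalyticsHub()
hub.connect("sales_2019", "s3://warehouse/sales/2019/")
hub.connect("crm_accounts", "salesforce://accounts")
hub.combine("sales_by_account", "sales_2019", "crm_accounts")
print(hub.find("sales"))  # ['sales_2019', 'sales_by_account']
```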

By providing virtualized access to analytics assets, Neebo eliminates costly and error-prone data movement, keeps assets securely in place, and ensures a trusted, single source of truth for each asset. Neebo also includes security and governance capabilities that complement and integrate with existing frameworks.

Neebo embeds AI in many of its features, including assisting with the discovery of assets so teams can find the optimal ones for their specific problems, providing data blend suggestions, and optimizing queries and caching. All this makes the work of analysts and data scientists easier and faster.

Neebo manages a wide variety of assets covering both traditional analytics and data science. This helps organizations unify all their analytics efforts and integrates data science initiatives into mainstream analytics processes and governance.

In September, Datameer announced general availability of Datameer X, a new release of its data preparation and exploration software built for data scientists and machine learning engineers. Now, organizations can speed up machine learning analytics cycles and create robust data flows that feed more data into machine learning models to increase their accuracy.

Masimo secures FDA clearance for neonatal RD SET Pulse Oximetry sensors with improved accuracy specifications

Masimo announced that RD SET sensors with Masimo Measure-through Motion and Low Perfusion SET pulse oximetry have received FDA clearance ...