storage system

Published By: NetApp     Published Date: Dec 09, 2014
NetApp Flash Pool is a storage cache option within the NetApp Virtual Storage Tier product family, available for NetApp FAS storage systems. A Flash Pool configures solid state drives (SSDs) and hard disk drives (HDDs) into a single storage pool, known as an “aggregate” in NetApp parlance, with the SSDs providing a fast response time cache for volumes that are provisioned on the Flash Pool aggregate.
Tags : 
netapp, hybrid, flash pool, ssd, hdd, iops, oltp, demartek, it management
    
NetApp
Published By: NetApp     Published Date: Sep 21, 2017
This brief explains how SolidFire addresses the modern data center's struggle between applications and storage, and how its distinct scale-out storage architecture intelligently controls system performance in real time.
Tags : 
netapp, database performance, flash storage, data management, cost challenges, solidfire
    
NetApp
Published By: NetApp     Published Date: Sep 21, 2017
To achieve five-nines (99.999%) availability or higher, you need to take a closer look at the design of your storage architecture. This paper offers a comprehensive method for assessing your current infrastructure and provides insight into ways to transform your design to drive better outcomes.
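As a rough, back-of-the-envelope illustration of what the five-nines figure means in practice (not taken from the paper), the short sketch below converts an availability target into allowed downtime per year:

```python
# Minimal sketch: convert an availability target into allowed downtime per year.
# Figures are approximate (assumes a 365-day year); not taken from the NetApp paper.

MINUTES_PER_YEAR = 365 * 24 * 60

for availability in (0.99, 0.999, 0.9999, 0.99999):
    downtime_min = MINUTES_PER_YEAR * (1 - availability)
    print(f"{availability:.3%} availability -> ~{downtime_min:.1f} minutes of downtime per year")
```

At 99.999%, that works out to roughly five minutes of downtime a year, which is why the paper focuses on storage architecture design rather than reactive fixes.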
Tags : 
netapp, database performance, flash storage, data management, cost challenges, solidfire
    
NetApp
Published By: Dell EMC     Published Date: May 15, 2015
This IDC study provides detailed insights into the rapidly growing market for enterprise storage systems that leverage flash storage media. The study segments this market into two technology segments, all-flash arrays (AFAs) and hybrid flash arrays (HFAs), and provides historical insight into each. Market shares and IDC analysis are provided for each of the top vendors in these two segments, along with detailed market-level analysis.
Tags : 
emc, storage all-flash, architectures, workload, data center, scale, storage, hybrid, vendor profiles, data management
    
Dell EMC
Published By: Cohesity     Published Date: Apr 24, 2018
As organizational needs change and workloads become increasingly distributed, a key realization is emerging: traditional approaches to backup and recovery may no longer be sufficient for many organizations. These companies may have discovered that their existing tools are not keeping pace with other advancements in their computing environments, such as scale-out storage systems and hyperconverged systems, which seek to reduce data center complexity and help manage surging storage costs.
Tags : 
    
Cohesity
Published By: Cohesity     Published Date: Oct 02, 2018
The University of California, Santa Barbara (UCSB) is a public research university and one of the 10 campuses of the University of California system. Its secondary storage was a combination of multiple point solutions; setup and maintenance through the UI were complex, and maintaining multiple licensing and maintenance agreements drove up administrative costs. The skyrocketing cost of additional backup capacity limited the team's ability to extend backup protection to many critical systems. With Cohesity's unified hyperconverged secondary storage platform, the IT team gave all 13 departments a single platform on which to consolidate their backups and scale out as required. Read the case study for details on how UCSB consolidated everything from backup to recovery, analytics to monitoring and alerting.
Tags : 
case, study, ucsb, consolidates, data, protection, cohesity, hyperconverged
    
Cohesity
Published By: Amazon Web Services     Published Date: Oct 09, 2017
Today's organizations are tasked with managing multiple data types coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that, to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A data lake allows an organization to store all of its data, structured and unstructured, in one centralized repository.
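As a hedged illustration of the "one centralized repository" idea described above (not taken from the whitepaper), the sketch below lands a semi-structured JSON event and a structured CSV extract side by side in the same Amazon S3 bucket; the bucket and key names are hypothetical.

```python
# Minimal data-lake sketch: store structured and semi-structured records side by side in S3.
# Assumes AWS credentials are already configured; bucket and key names are illustrative only.
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-lake"  # hypothetical bucket name

# Semi-structured event, stored as JSON.
event = {"user_id": 42, "action": "login", "ts": "2017-10-09T12:00:00Z"}
s3.put_object(Bucket=BUCKET,
              Key="raw/events/2017/10/09/event-0001.json",
              Body=json.dumps(event).encode("utf-8"))

# Structured records, stored as CSV in the same repository.
csv_rows = "user_id,plan,signup_date\n42,pro,2017-01-15\n"
s3.put_object(Bucket=BUCKET,
              Key="raw/accounts/accounts.csv",
              Body=csv_rows.encode("utf-8"))
```

Because both objects live in the same repository, downstream analytics tools can query them without first forcing the data into a single schema.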
Tags : 
cost effective, data storage, data collection, security, compliance, platform, big data, it resources
    
Amazon Web Services
Published By: Pure Storage     Published Date: Jul 18, 2017
OTS' experiences with all-flash technology mirror what many customers have experienced over the past several years. Hesitantly optimistic, OTS deployed its first AFA in a limited environment to prove the system out, and when it met or exceeded the company's expectations, OTS quickly began moving more workloads to it. Like OTS, many customers do not immediately retire their legacy storage, but by moving the more demanding workloads off of their existing environments, they provide better application performance both for the applications on the AFA and for those that stay on the legacy storage. And like OTS, the majority of AFA customers have, in the past year in particular, come to the conclusion that all-flash technology is the future of primary storage for reasons of performance, reliability, ease of management and expansion, and economics (as measured by total cost of ownership).
Tags : 
external storage systems, flash arrays, scale-up, scale-out, managing it infrastructure, data center, idc, market intelligence
    
Pure Storage
Published By: Teradata     Published Date: May 02, 2017
Should the data warehouse be deployed on the cloud? Read this IDC Research Spotlight to learn more.
Tags : 
data warehouse, data storage, data management, data analytics, data preparation, data integration, system integration
    
Teradata
Published By: Pure Storage     Published Date: Jul 26, 2017
OTS' experiences with all-flash technology mirror what many customers have experienced over the past several years. Hesitantly optimistic, OTS deployed its first AFA in a limited environment to prove the system out, and when it met or exceeded the company's expectations, OTS quickly began moving more workloads to it. Like OTS, many customers do not immediately retire their legacy storage, but by moving the more demanding workloads off of their existing environments, they provide better application performance both for the applications on the AFA and for those that stay on the legacy storage. And like OTS, the majority of AFA customers have, in the past year in particular, come to the conclusion that all-flash technology is the future of primary storage for reasons of performance, reliability, ease of management and expansion, and economics (as measured by total cost of ownership).
Tags : 
external storage systems, flash arrays, scale-up, scale-out, managing it infrastructure, data center, idc, market intelligence
    
Pure Storage
Published By: Pure Storage     Published Date: Apr 10, 2019
Deep learning opens up new worlds of possibility in artificial intelligence, enabled by advances in computational capacity, the explosion in data, and the advent of deep neural networks. But data is evolving quickly and legacy storage systems are not keeping up. Advanced AI applications require a modern all-flash storage infrastructure that is built specifically to work with high-powered analytics.
Tags : 
    
Pure Storage
Published By: Pure Storage     Published Date: Apr 10, 2019
This document describes the technical reasons for, and benefits of, an end-to-end training system and why the Pure Storage® FlashBlade™ product is an essential platform. It also shows performance benchmarks based on a system that combines the NVIDIA® DGX-1™, a multi-GPU server purpose-built for deep learning applications, with Pure FlashBlade, a scale-out, high-performance, dynamic data hub for the entire AI data pipeline.
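The benchmark configuration itself is described in the document; as a generic sketch of the pattern it relies on (not the paper's actual setup), the snippet below shows GPU training being fed from a shared, network-mounted dataset through many parallel loader workers, which is where the storage layer's sustained read throughput becomes the limiting factor. The mount path and loader settings are illustrative assumptions.

```python
# Generic sketch of the data-ingest side of a deep learning training pipeline.
# The mount point and loader settings are illustrative, not Pure Storage's or NVIDIA's configuration.
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Dataset lives on shared storage (e.g., a network mount backed by a scale-out array).
DATA_DIR = "/mnt/shared-datasets/imagenet/train"  # hypothetical path

dataset = datasets.ImageFolder(
    DATA_DIR,
    transform=transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
    ]),
)

# Many parallel workers keep the GPUs fed; sustained read throughput from the
# storage layer determines whether those workers stay busy.
loader = DataLoader(dataset, batch_size=256, num_workers=16,
                    pin_memory=True, shuffle=True)

for images, labels in loader:
    pass  # forward/backward pass on the GPU(s) would go here
```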
Tags : 
    
Pure Storage
Published By: NetApp     Published Date: Feb 19, 2015
NetApp Flash Pool is a storage cache option within the NetApp Virtual Storage Tier product family, available for NetApp FAS storage systems. A Flash Pool configures solid state drives (SSDs) and hard disk drives (HDDs) into a single storage pool, known as an “aggregate” in NetApp parlance, with the SSDs providing a fast response time cache for volumes that are provisioned on the Flash Pool aggregate. In this lab evaluation, NetApp commissioned Demartek to evaluate the effectiveness of Flash Pool with different types and numbers of hard disk drives using an online transaction processing (OLTP) database workload, and to evaluate the performance of Flash Pool in a clustered Data ONTAP environment during a cluster storage node failover scenario. In the report, you’ll discover how Demartek test engineers documented a 283% gain in IOPS and a 66x reduction in latency after incorporating NetApp Flash Pool technology.
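As a back-of-the-envelope sketch of why a flash cache in front of HDDs can cut average latency so sharply (the latencies and hit ratios below are illustrative assumptions, not Demartek's measurements):

```python
# Illustrative only: weighted-average read latency for an SSD cache in front of HDDs.
# Latency figures and hit ratios are assumptions, not results from the Demartek report.

SSD_LATENCY_MS = 0.2   # assumed flash read latency
HDD_LATENCY_MS = 8.0   # assumed disk read latency

def effective_latency(hit_ratio):
    """Average latency when hit_ratio of reads are served from the SSD cache."""
    return hit_ratio * SSD_LATENCY_MS + (1 - hit_ratio) * HDD_LATENCY_MS

for hit_ratio in (0.0, 0.5, 0.9, 0.99):
    print(f"cache hit ratio {hit_ratio:.0%}: ~{effective_latency(hit_ratio):.2f} ms per read")
```

The higher the share of hot OLTP reads served from flash, the closer the effective latency gets to SSD speed, which is the mechanism behind the large IOPS and latency improvements reported.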
Tags : 
    
NetApp
Published By: Pure Storage     Published Date: Mar 15, 2018
The all-flash array (AFA) market has undergone significant maturation over the past two years. A high percentage of customers have already committed to an "all flash for primary storage" strategy, and every customer interviewed for this study was among them. In 2017, AFAs will drive over 80% of all primary storage revenue. All of the established storage vendors have entered this space, and there are several start-ups with over $100 million in revenue. With this level of market maturation, multiple segments have developed within the primary flash array space: systems targeted at dedicated application deployment, systems built specifically for web-scale applications, and systems intended for dense mixed workload consolidation. These latter systems drive most of the AFA revenue, and they aspire to become the primary storage platforms of record for enterprises of all sizes. This study evaluates the suitability of 10 vendors' AFA platforms for dense mixed enterprise workload consolidation.
Tags : 
    
Pure Storage
Published By: Oracle     Published Date: Jan 28, 2019
Databases tend to hold an organization’s most important information and power the most crucial applications. It only makes sense, then, to run them on a system that’s engineered specifically to optimize database infrastructure. Yet some companies continue to run their databases on do-it-yourself (DIY) infrastructure, using separate server, software, network, and storage systems. It’s a setup that increases risk, cost, complexity, and time spent deploying and managing the systems, given that it typically involves at least three different IT groups.
Tags : 
    
Oracle
Published By: HP     Published Date: Feb 02, 2015
In this study we compared storage efficiency and the ease of managing and monitoring an EMC VNX unified array versus an HP 3PAR StoreServ unified array. Our approach was to set up the two arrays side by side and record the actual complexity of managing each array for file and block access, per the documents and guides provided for each product. We also went through the exercise of sizing various arrays via publicly available configuration guides to see what the expected storage density efficiency would be for some typically configured systems. Our conclusions were nothing short of astonishing. Read this whitepaper to learn more.
Tags : 
    
HP
Published By: HP     Published Date: Feb 20, 2015
Integrated Computing (IC), also known as Converged Infrastructure, stands to revolutionize everything about IT. IC combines virtualization, storage, networking, and compute into a single system and allows total infrastructure oversight through one console. As you can imagine, the improvements in ease of use, agility, and efficiency are substantial. This is essential because infrastructure maintenance devours about three-fourths of annual IT budgets. IC changes all of that, delivering high-efficiency solutions across the system's lifecycle, reducing costs and freeing up valuable personnel. The evolution of IT awaits.
Tags : 
    
HP
Published By: NetApp     Published Date: Sep 22, 2014
NetApp Flash Pool is a storage cache option within the NetApp Virtual Storage Tier product family, available for NetApp FAS storage systems. A Flash Pool configures solid state drives (SSDs) and hard disk drives (HDDs) into a single storage pool, known as an “aggregate” in NetApp parlance, with the SSDs providing a fast response time cache for volumes that are provisioned on the Flash Pool aggregate. In this lab evaluation, NetApp commissioned Demartek to evaluate the effectiveness of Flash Pool with different types and numbers of hard disk drives using an online transaction processing (OLTP) database workload, and to evaluate the performance of Flash Pool in a clustered Data ONTAP environment during a cluster storage node failover scenario. In the report, you’ll discover how Demartek test engineers documented a 283% gain in IOPS and a 66x reduction in latency after incorporating NetApp Flash Pool technology.
Tags : 
flash pool, fas storage systems, ssd, online transaction processing, cluster storage
    
NetApp
Published By: HP     Published Date: Feb 02, 2015
Integrated Computing (IC), also known as Converged Infrastructure, stands to revolutionize everything about IT. IC combines virtualization, storage, networking, and compute into a single system and allows total infrastructure oversight through one console. As you can imagine, the improvements in ease of use, agility, and efficiency are substantial. This is essential because infrastructure maintenance devours about three-fourths of annual IT budgets. IC changes all of that, delivering high-efficiency solutions across the system's lifecycle, reducing costs and freeing up valuable personnel. The evolution of IT awaits.
Tags : 
    
HP
Published By: HP     Published Date: Feb 11, 2015
In this study we compared storage efficiency and the ease of managing and monitoring an EMC VNX unified array versus an HP 3PAR StoreServ unified array. Our approach was to set up the two arrays side by side and record the actual complexity of managing each array for file and block access, per the documents and guides provided for each product. We also went through the exercise of sizing various arrays via publicly available configuration guides to see what the expected storage density efficiency would be for some typically configured systems. Our conclusions were nothing short of astonishing. Read this whitepaper to learn more.
Tags : 
    
HP
Published By: MarkLogic     Published Date: Mar 13, 2015
Big Data has been in the spotlight recently, as businesses seek to leverage their untapped information resources and win big on the promise of big data. However, the problem with big data initiatives is that organizations try to use existing information management practices and legacy relational database technologies, which often collapse under the sheer weight of the data. In this paper, MarkLogic explains why a new approach is needed to handle the volume, velocity, and variety of big data: the relational model that has been the status quo is not working. Learn about the NoSQL paradigm shift, and why NoSQL is gaining significant market traction as it solves the fundamental challenges of big data, achieving better performance, scalability, and flexibility. Learn how MarkLogic's customers are reimagining their data to: - Make the world more secure - Provide access to valuable information - Create new revenue streams - Gain insights to increase market share - Reduce b
Tags : 
enterprise, nosql, relational, databases, data storage, management system, application, scalable, enterprise applications, data management
    
MarkLogic
Published By: MarkLogic     Published Date: Mar 17, 2015
You’ve probably heard about NoSQL, and you may wonder what it is. NoSQL represents a fundamental change in the way people think about storing and accessing data, especially now that most of the information generated is unstructured or semi-structured data — something for which existing database systems such as Oracle, MySQL, SQL Server, and Postgres aren’t well suited. NoSQL means a release from the constraints imposed on database management systems by the relational database model. This free eBook, Enterprise NoSQL for Dummies, MarkLogic Special Edition, provides an overview of NoSQL. You’ll start to understand what it is, what it isn’t, when you should consider using a NoSQL database instead of a relational database management system, and when you may want to use both. In addition, this book introduces enterprise NoSQL, shows how it differs from other NoSQL systems, and explains when NoSQL may not be the right solution for your data storage problem. You’ll also learn the NoSQ
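As a generic illustration of the document model the eBook describes (plain Python and JSON for illustration only, not MarkLogic's API), the sketch below stores records of different shapes side by side with no schema declared up front:

```python
# Generic sketch of the document model: records with different shapes stored together,
# with no fixed schema declared in advance. Field names are hypothetical examples.
import json

documents = [
    {"type": "invoice", "id": 1001, "total": 250.00,
     "lines": [{"sku": "A-17", "qty": 2}]},
    {"type": "support_ticket", "id": "T-88", "status": "open",
     "notes": "Customer reports slow backups."},      # entirely different fields
    {"type": "invoice", "id": 1002, "total": 99.95},  # same type, fewer fields
]

# A relational table would force one schema onto all of these records; a document
# store persists each record as-is and indexes whatever fields are present.
for doc in documents:
    print(json.dumps(doc))
```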
Tags : 
enterprise, nosql, relational, databases, data storage, management system, application, scalable, enterprise applications, data management
    
MarkLogic
Published By: Teradata     Published Date: Aug 27, 2015
While your specific data throughput mileage may vary, this paper will share the factors that directly contribute to a Data Movement solution, the data storage ecosystem alternatives, and the pitfalls you may face when attempting to architect such a solution in your data warehouse environment.
Tags : 
analytics, big data, applications, lob, application integration, analytical applications, configuration management
    
Teradata
Published By: Druva     Published Date: Nov 09, 2018
The rise of virtualization as a business tool has dramatically enhanced server and primary storage utilization. By allowing multiple operating systems and applications to run on a single physical server, organizations can significantly lower their hardware costs and take advantage of efficiency and agility improvements as more and more tasks become automated. This also alleviates the pain of fragmented IT ecosystems and incompatible data silos. Currently, this virtualization juggernaut shows no sign of slowing. As businesses recognize the potential for increased reliability and scalability offered by virtual technology, they are ramping up their investments in data center modernization and upgrading. In fact, 33 percent of the respondents to a recent ESG survey on cloud usage said that making greater use of server virtualization was one of their top five spending priorities for the next 12 to 18 months.
Tags : 
    
Druva