production data

Published By: Intel     Published Date: Feb 28, 2019
Keeping the lights on in a manufacturing environment remains a top priority for industrial companies. All too often, factories operate in a reactive mode, relying on manual inspections that risk downtime because they rarely reveal actionable problem data. Find out how the Nexcom Predictive Diagnostic Maintenance (PDM) system enables uninterrupted production during outages by noninvasively monitoring each unit in the Diesel Uninterruptible Power Supplies (DUPS) system.
• Using vibration analysis, the system can detect 85% of power supply problems before they do damage or cause failure.
• Information processing for machine diagnostics is done at the edge, providing real-time alerts on potential issues with ample lead time for managers to rectify them.
• A graphical user interface offers visual representation and analysis of historical and trending data that is easily consumable.
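The abstract does not include code; purely as a rough illustration of the kind of edge-side vibration check it describes (not Nexcom's actual PDM software), a threshold test on accelerometer readings might look like the sketch below. The baseline level, alert factor and alert hook are assumed values.

```python
import math
from statistics import mean

# Hypothetical baseline and alert threshold; real systems derive these
# from historical vibration data per machine.
BASELINE_RMS = 0.12   # g, typical vibration level for a healthy unit
ALERT_FACTOR = 2.5    # alert when RMS exceeds 2.5x the baseline

def rms(samples):
    """Root-mean-square vibration amplitude for one sampling window."""
    return math.sqrt(mean(s * s for s in samples))

def check_window(samples, unit_id):
    """Evaluate one window of accelerometer readings at the edge."""
    level = rms(samples)
    if level > BASELINE_RMS * ALERT_FACTOR:
        # In a real deployment this would push an alert to the
        # maintenance dashboard rather than print.
        print(f"ALERT {unit_id}: vibration RMS {level:.3f} g exceeds threshold")
    return level

# Example window of readings (fabricated values).
check_window([0.11, 0.35, 0.40, 0.33, 0.29], unit_id="DUPS-03")
```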
Published By: SAS     Published Date: Jan 17, 2018
The Industrial Internet of Things (IIoT) is flooding today’s industrial sector with data. Information is streaming in from many sources — equipment on production lines, sensors at customer facilities, sales data, and much more. Harvesting insights means filtering out the noise to arrive at actionable intelligence. This report shows how to craft a strategy to gain a competitive edge. It explains how to evaluate IIoT solutions, including what to look for in end-to-end analytics solutions. Finally, it shows how SAS has combined its analytics expertise with Intel’s leadership in IIoT information architecture to create solutions that turn raw data into valuable insights.
Published By: Carbonite     Published Date: Jul 18, 2018
Diamond Foods’ Diamond of California® nuts are household staples for shoppers across the U.S. But constantly filling grocery store shelves with snacks requires intricate supply chain management that relies on critical business data, including complex spreadsheets and enterprise resource planning files, to keep production and deliveries on schedule. “If our critical servers go down or we lose important data on employee laptops, it has a direct impact on our bottom line,” says Kentrell Davis, Senior Client Support Services Analyst at Diamond Foods.
Published By: Dell EMC     Published Date: Aug 17, 2018
Dell EMC technology for Digital Manufacturing harnesses the workstation, HPC and storage capabilities that combine to enable better products and more efficient design and production processes, and to meet rapidly changing customer preferences. Collecting, collating and digesting more and more data across the entire ecosystem, from product modelling to after-sales trends, is making the digital factory a powerful and necessary reality in the manufacturing landscape.
Published By: Dell EMC     Published Date: Nov 26, 2018
Dell EMC technology for Digital Manufacturing harnesses the workstation, HPC and storage capabilities that combine to enable better products and more efficient design and production processes, and to meet rapidly changing customer preferences. Collecting, collating and digesting more and more data across the entire ecosystem, from product modelling to after-sales trends, is making the digital factory a powerful and necessary reality in the manufacturing landscape. Learn more about Dell Precision® workstations featuring Intel® Xeon® processors.
Published By: Delphix     Published Date: May 03, 2016
High-profile data breaches continue to make headlines as organizations struggle to manage information security in the face of rapidly changing applications, data centers, and the cloud. Against this backdrop, data masking has emerged as one of the most effective ways to protect sensitive test data from insider and outsider threats alike. While masking is now the de facto standard for protecting non-production data, implementing it alongside virtual data technologies has elevated its effectiveness even further.
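As a generic illustration of the masking technique discussed above (this is not Delphix's implementation), a deterministic masking pass over a CSV export of customer data might look like the following sketch; the column names, file layout and hashing scheme are assumptions.

```python
import csv
import hashlib

def mask_value(value, salt="demo-salt"):
    """Deterministically replace a sensitive value with an opaque token.
    Deterministic masking keeps referential integrity: the same input
    always maps to the same token across tables."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return f"MASKED_{digest[:12]}"

def mask_customers(in_path, out_path, sensitive_cols=("name", "email", "ssn")):
    """Copy a CSV file, masking the listed sensitive columns."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            for col in sensitive_cols:
                if col in row and row[col]:
                    row[col] = mask_value(row[col])
            writer.writerow(row)
```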
Published By: Delphix     Published Date: May 03, 2016
Today's test data management (TDM) solutions force teams to work with compromised data sets, and push testing to too late in the software development lifecycle. The end result is rework, delayed releases, and costly bugs that cripple production systems. Furthermore, prevailing approaches to test data management - including subsetting, synthetic data, shared environments, and standalone masking--represent flawed solutions that fail across one or more key dimensions.
Published By: TIBCO Software GmbH     Published Date: Jan 15, 2019
The current trend in manufacturing is towards tailor-made products in smaller lots with shorter delivery times. This change may lead to frequent production modifications, resulting in increased machine downtime, higher production costs, product waste, and the need to rework faulty products. To satisfy the customer demand behind this trend, manufacturers must move quickly to new production models. Quality assurance is the key area that IT must support. At the same time, the traceability of products becomes central to compliance as well as quality. Traceability can be achieved by interconnecting data sources across the factory, analyzing historical and streaming data for insights, and taking immediate action to control the entire end-to-end process. Doing so can lead to noticeable cost reductions and gains in efficiency, process reliability, and speed of new product delivery. Additionally, analytics helps manufacturers find the best setups for machinery.
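As a loose illustration of the kind of quality check on streaming process data described above (not TIBCO's product API), a simple control-limit test on incoming measurements could look like this sketch; the measurement, control limits and batch identifiers are assumed values.

```python
from dataclasses import dataclass

@dataclass
class ControlLimits:
    mean: float
    sigma: float  # historical standard deviation of the measurement

def out_of_control(value, limits, k=3.0):
    """Flag a measurement outside the +/- k*sigma control band."""
    return abs(value - limits.mean) > k * limits.sigma

# Example: seal temperature on a packaging line (assumed figures).
limits = ControlLimits(mean=182.0, sigma=1.5)
stream = [181.7, 182.4, 183.1, 188.9, 182.2]
for batch_id, reading in enumerate(stream):
    if out_of_control(reading, limits):
        print(f"Batch {batch_id}: reading {reading} outside control limits, hold for inspection")
```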
Published By: Red Hat Government     Published Date: Jul 21, 2011
Organizations around the world are benefiting from public clouds. But, when production applications or critical data is involved, it's important to extend on-premise governance to your public or hybrid cloud resources. Effective cloud governance is possible.
Tags: cloud governance, governance, cloud computing, interoperability, red hat, open source, idc, enterprise linux, enterprise solutions, virtualization, linux, unix, redhat, government, infrastructure
Published By: SolidFire_NetApp     Published Date: Oct 10, 2016
This paper introduces five architectural principles guiding the development of the next generation data center (NGDC). It describes key market influences driving a fundamental enterprise IT transformation and the technological trends that support it. The five principles are: scale-out, guaranteed performance, automated management, data assurance, and global efficiencies. Cloud infrastructure delivery models such as IaaS, private clouds, and software-defined data centers (SDDC) are foundations for the NGDC. In an era where IT is expected to ensure production-grade support for an ever-growing flow of new applications and data, these models demonstrate how to eliminate bottlenecks, increase self-service, and move the business forward. The NGDC applies a software-defined everything (SDx) discipline in a traditional, hardware-centric business to gain business advantage.
Published By: CA Technologies EMEA     Published Date: Sep 07, 2018
There are five ways to provision test data. You can copy or take a snapshot of your production database or databases. You can provision data manually or via a spreadsheet. You can derive virtual copies of your production database(s). You can generate subsets of your production database(s). And you can generate synthetic data that is representative of your production data but is not actually real. Of course, the first four examples assume that the data you need for testing purposes is available to you from your production databases. If this is not the case, then only manual or synthetic data provision is a viable option. Download this whitepaper to find out more about how CA Technologies can help your business and its Test Data problems.
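Purely to illustrate the last of the five options, synthetic generation, here is a minimal sketch; it is not CA Test Data Manager's interface, and the field names and value distributions are assumptions.

```python
import datetime
import random
import string

def synthetic_customer(row_id):
    """Generate one representative but entirely fabricated customer row."""
    name = "".join(random.choices(string.ascii_uppercase, k=8))
    return {
        "customer_id": row_id,
        "name": f"TEST_{name}",
        "signup_date": datetime.date(2020, 1, 1)
        + datetime.timedelta(days=random.randint(0, 1500)),
        "credit_limit": random.choice([500, 1000, 2500, 5000]),
        "country": random.choice(["DE", "FR", "UK", "US"]),
    }

# A small synthetic data set for a test environment; no production data touched.
test_rows = [synthetic_customer(i) for i in range(1, 101)]
```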
Published By: Oracle ZDLRA     Published Date: Jan 10, 2018
Traditional backup systems fail to meet the database protection and recovery requirements of modern organizations. These systems require ever-growing backup windows, negatively impact performance in mission-critical production databases, and deliver recovery time objectives (RTO) and recovery point objectives (RPO) measured in hours or even days. For high-volume, highly transactional databases, falling short of these requirements can cost millions in lost productivity and revenue, regulatory penalties, and reputation damage due to an outage or data loss.
Tags: data protection, backup speed, recovery, overhead, assurance, storage, efficiency, oracle
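To make the recovery point objective (RPO) concern in the abstract above concrete, here is a back-of-envelope calculation; the backup interval and transaction rate are assumed figures, not numbers from the paper.

```python
# With a nightly backup, the worst-case recovery point is the full backup
# interval; with continuous redo shipping it approaches zero.
backup_interval_hours = 24      # assumed: one full backup per day
transactions_per_second = 500   # assumed average workload

worst_case_lost_seconds = backup_interval_hours * 3600
worst_case_lost_txns = worst_case_lost_seconds * transactions_per_second
print(f"Worst-case data loss with nightly backups: "
      f"{worst_case_lost_txns:,} transactions ({backup_interval_hours} h of activity)")
```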
Published By: CA Technologies_Business_Automation     Published Date: Jun 29, 2018
Challenge: It is not uncommon for SAP system copies, including any post-editing, to take several days to complete. Meanwhile, testing, development and training activities come to a standstill, and the large number of manual tasks in the entire process ties up highly skilled SAP BASIS staff.
Opportunity: Enterprises are looking to automation as a way to accelerate SAP system copies and free up staff. However, this is only one part of the problem: what further complicates the system copy process is the need to safeguard sensitive data and manage huge data volumes while also ensuring that the data used in non-production systems adequately reflects the data in production systems, so the quality of development, testing and training activities is not compromised.
Benefits: This white paper explains how a considerable portion of the SAP system copy process can be automated using the CA Automic Automated System Copy for SAP solution and SNP T-Bone, helping enterprises become more agile.
Published By: JBoss Developer     Published Date: Oct 21, 2016
In April 2014, Nissan opened a new production plant in Rio de Janeiro, Brazil with an annual production capacity of 200,000 vehicles and 200,000 engines. Along with a target of reaching 5% market share in Brazil by 2016, the plant’s opening was aligned with Nissan’s efforts to achieve 8% global market share, part of the “NISSAN POWER 88” five-year mid-term management plan initiated in 2011. To help align their strategic and business goals with their use of technology, Nissan chose Red Hat JBoss BRMS to replace their legacy system.
Tags: nissan, production, it optimization, strategic partners, data, enterprise resource planning, enterprise software, data warehousing
Published By: CA Technologies     Published Date: Sep 13, 2017
"The Implications for Test Data Management The GDPR is set to have wide-ranging implications for the type of data which can be used in non-production environments. Organizations will need to understand exactly what data they have and who’s using it, and be able to restrict its use to tasks where they have consent. Learn more about how you can protect the data that matters most and comply with the GDPR."
Published By: CA Technologies     Published Date: Sep 13, 2017
"There's new legislation in place, that's expanded the definition of personal data and puts IT and testing departments on high alert to safeguard personal data, across testing and development environments. It's the General Data Protection Regulation (GDPR). Are you ready for it? In this session, we’ll demonstrate how CA Test Data Manager helps to both mask your production data and to generate synthetic test data; a powerful combination to help you meet compliance needs and deliver quality applications. There will be a short section on the future of the tester self-service model that will enable testers to efficiently get access to the right test data."
Published By: Attunity     Published Date: Nov 15, 2018
Change data capture (CDC) technology can modernize your data and analytics environment with scalable, efficient and real-time data replication that does not impact production systems. To realize these benefits, enterprises need to understand how this critical technology works, why it’s needed, and what their Fortune 500 peers have learned from their CDC implementations. This book serves as a practical guide for enterprise architects, data managers and CIOs as they enable modern data lake, streaming and cloud architectures with CDC. Read this book to understand:
• The rise of data lake, streaming and cloud platforms
• How CDC works and enables these architectures
• Case studies of leading-edge enterprises
• Planning and implementation approaches
Tags: optimize customer service
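As a rough, generic illustration of what change data capture does, the loop below replicates rows changed since the last poll. This is a naive timestamp-polling sketch, not Attunity's log-based approach (which reads the transaction log and avoids query load on the source); the table and column names are assumptions.

```python
import sqlite3
import time

def poll_changes(conn, last_seen_ts):
    """Naive CDC: find rows whose last_updated timestamp is newer than the
    last one we processed. Log-based CDC tools read the transaction log
    instead, avoiding this query load on the production system."""
    return conn.execute(
        "SELECT id, name, last_updated FROM orders "
        "WHERE last_updated > ? ORDER BY last_updated",
        (last_seen_ts,),
    ).fetchall()

def replicate_forever(source_db="source.db"):
    conn = sqlite3.connect(source_db)
    last_seen = 0.0
    while True:
        for row_id, name, updated in poll_changes(conn, last_seen):
            # Apply the change to the target (data lake, stream, cloud store...).
            print(f"replicating order {row_id} ({name}) changed at {updated}")
            last_seen = max(last_seen, updated)
        time.sleep(5)  # polling interval
```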
Published By: CA Technologies EMEA     Published Date: Oct 23, 2017
To energize the market or stay competitive, companies must deliver their software products faster than ever. Traditional quality assurance methods, however, cannot keep up with this sustained pace. Although they have helped organizations improve code quality, continuous integration mechanisms do not support end-to-end testing across the code lifecycle.
Tags: continuous testing, automated deployment, continuous delivery, testing, quality assurance, development, code quality, ca test data manager
Published By: CA Technologies EMEA     Published Date: Oct 23, 2017
In today's agile organizations, operations teams face a major challenge: deploying new releases to production immediately after the development and test phases. For such deployments to succeed, an automated, transparent process must be put in place. We have named this process Zero Touch Deployment™. This article examines two approaches to Zero Touch Deployment: a script-based solution and a release automation platform. It shows how each approach can address the main technological and organizational challenges agile organizations face when they decide to implement an automated deployment system. The article begins by outlining the business and technological context that is driving agile organizations toward deployment automation solutions.
Tags: zero touch deployment, continuous testing, automated deployment, continuous delivery, testing, quality assurance, development, code quality, ca test data manager
Published By: CA Technologies EMEA     Published Date: Oct 23, 2017
To truly implement a continuous delivery approach, organizations must completely rethink how they run their quality assurance (QA) process. This means redefining the role QA professionals play within the organization, automating as much as possible at every level, and entirely rethinking test structures to support leaner, faster software releases.
Tags: continuous testing, automated deployment, continuous delivery, testing, quality assurance, development, code quality, ca test data manager
Published By: CA Technologies EMEA     Published Date: Oct 23, 2017
Move your ideas quickly from design to development, without sacrificing quality, thanks to an end-to-end continuous delivery ecosystem capable of applying rigorous testing based on the desired user functionality.
Tags: continuous testing, automated deployment, continuous delivery, testing, quality assurance, development, code quality, ca test data manager
Published By: CA Technologies EMEA     Published Date: Oct 23, 2017
In a series of articles, Paul Gerrard, consultant and testing specialist, addresses a range of questions about testing. Test models are essential to testing, and in this article Paul Gerrard discusses the art of creating and using them. As testers get involved earlier, working alongside developers or at least more closely with them, testers (and developers) must be able to create models, know how to articulate and share them, and encourage better collaboration.
Tags: test models, continuous testing, automated deployment, continuous delivery, testing, quality assurance, development, code quality, ca test data manager
Published By: CA Technologies EMEA     Published Date: Oct 24, 2017
Generate rich virtual data that covers every possible scenario and provides unlimited access to the environments needed to deliver carefully tested applications, on time and on budget. Model the data of complex real-world systems and apply automated rule-learning algorithms to eliminate technical debt and enable a deep understanding of composite applications. In addition, make on-demand virtual data available to distributed teams and avoid testing bottlenecks.
Tags: test models, continuous testing, automated deployment, continuous delivery, testing, quality assurance, development, code quality, ca test data manager
Published By: SAS     Published Date: Apr 16, 2015
The framework presented here is a way to avoid data dysfunction via a coordinated and well-planned governance initiative. These initiatives require two elements related to the creation and management of data:
• The business inputs to data strategy decisions via a policy development process.
• The technology levers needed to monitor production data based on the policies.
Collectively, data governance artifacts (policies, guiding principles and operating procedures) give notice to all stakeholders and let them know, “We value our data as an asset in this organization, and this is how we manage it.”