Broadcom
Last Updated:
Analyst Coverage: Philip Howard, Daniel Howard, David Norfolk, Fran Howarth and Paul Bevan
Broadcom Inc. has been a global infrastructure technology leader for roughly 50 years, with roots in AT&T/Bell Labs, Lucent and Hewlett-Packard/Agilent. It was originally, and remains, a global designer, developer and manufacturer of semiconductors and infrastructure software solutions.
Broadcom now focuses on global connectivity technology, and the original company has grown through the acquisitions of LSI, Brocade, CA Technologies and Symantec to become one of the largest infrastructure technology vendors in the world.
Bloor’s relationship with Broadcom largely grew out of its previous relationships with:
- CA Technologies (a Broadcom Company), which creates systems software that supports and manages mainframe, distributed computing and virtual machine environments, as well as application delivery, security, data centre infrastructure, and testing software (such as test automation and test data management). It operates primarily in the B2B space and claims that its customers include the majority of the Forbes Global 2000 companies.
- Symantec (a division of Broadcom), which is a large multinational software vendor with operations in more than 50 countries, that is focused on security.
Agile Designer
Last Updated: 22nd February 2015
Agile Designer is an end-to-end test case design and requirements definition tool, which allows users to automatically generate test cases linked to test data, virtual assets and expected results. As part of this, it can find and create test data, map requirements to active flowcharts, automatically generate automation frameworks, calculate the cost and complexity of requirements, and quickly manage changes.
The problem for testers is testing the right things, just enough: you don't want to over-test or under-test, and you want to focus effort on what matters most. Automation can significantly help on both counts.
Historically, automated test case generation has centred on two major approaches: pairwise testing and cause-and-effect modelling. The former is limited in its applicability, does not lend itself to collaboration with business users, offers no way to tie tests back to the requirements of the application under test, and includes no mechanism for real-world constraints. Cause-and-effect modelling is much more comprehensive and is requirements-driven; however, it is complex and requires special training to use. Grid-Tools' Agile Designer supports both of these methods, plus constraint modelling, but also introduces a third way.
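To ground the comparison, consider what pairwise generation actually does: it selects a small set of test cases such that every pair of parameter values appears in at least one case. The following is a minimal, greedy sketch of that idea in Python; the parameters and values are purely illustrative, and real tools use far more efficient algorithms.

```python
from itertools import combinations, product

# Illustrative parameters only; a real model comes from the application under test.
params = {
    "browser": ["Chrome", "Firefox", "Safari"],
    "os": ["Windows", "Linux"],
    "account": ["guest", "registered", "admin"],
}
names = list(params)

def pairs_covered(row):
    """All (parameter-pair, value-pair) combinations exercised by one test case."""
    return {((a, b), (row[names.index(a)], row[names.index(b)]))
            for a, b in combinations(names, 2)}

candidates = list(product(*params.values()))
uncovered = set().union(*(pairs_covered(row) for row in candidates))

suite = []
while uncovered:
    # Greedily pick the candidate covering the most still-uncovered pairs.
    best = max(candidates, key=lambda r: len(pairs_covered(r) & uncovered))
    suite.append(best)
    uncovered -= pairs_covered(best)

print(f"{len(suite)} pairwise tests instead of {len(candidates)} exhaustive ones")
```

Note that nothing in this procedure knows anything about requirements or constraints, which is precisely the weakness described above.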
Agile Designer's third way is based on flow chart modelling. While this is not quite as effective as cause-and-effect modelling at discovering defects, it is significantly better at this than pairwise testing and other methods. Conversely, it is much easier to use than either of the other approaches and is easy for business users to understand, which is especially important in agile environments.
Agile Designer integrates with Datamaker and there is a Test Data on Demand application that allows users to expose Datamaker functions to testers.
Grid-Tools is not fussy about who uses Agile Designer and has no hard industry focus, although it does concentrate to a certain extent on the healthcare and banking sectors, where it has had some notable success. Both sectors are heavily regulated, and compliance requirements around the protection of personal information are paramount in each.
More generically, the company's main emphasis is on companies that agree with its data-driven approach and who appreciate that agile development is as much about the data, and particularly the test data, as it is about the development processes themselves. It would argue, and we would agree, that you can't have a truly agile development process without agile test data to go with it.
In addition to its own teams, the company has an extensive network of partners across the globe, with trained staff in over 18 countries worldwide. These are split between regional resellers that serve the needs of local markets, and global strategic partners, which include CA Technologies, Accenture and HP. Partners in the Americas include Orasi Software and Softworx, while Central and South America is served by Green Light Technology. In Europe, the Middle East and Asia, partners include ANECON GmbH, Blue Turtle Technologies, Cast Info, INFA Partner, Infuse, Lemontree, Sothis Yazlim, Spica Solutions, WSM, MTP, Soflab Technology, and Software AG.
Grid-Tools has had some significant success in the financial sector though none of its major banking clients can be named. Projects range from establishing a new data warehouse to migrations. Government contracts and healthcare are also notable but again unnamed. The company's website provides a number of case studies though none of the named users will be familiar to the man on the Clapham omnibus.
While Agile Designer offers both pairwise testing and cause-and-effect modelling as options, we will concentrate here on those features that make the product unique, in particular its use of flow charts. To begin with, flow charts are directly linked to requirements. If you already have requirements in place in a third-party tool (HP ALM, TIBCO Nimbus, Cordys, Critical Logic TMX or VersionOne), there are facilities to import and reverse engineer them so that they can be presented as a set of flow charts. One useful side effect of this process is that ambiguities in the original requirements become easy to identify.
Flow charts are colour-coded so that you can differentiate visually between possible and impossible (constrained) paths, and between paths that have been tested with no defects, paths that have been tested and have defects (which are highlighted), and paths that have not yet been tested. You can select a sub-flow and work on that element of the overall requirements set, and you can also prioritise. In any case, the software will automatically cover all the relevant logic with a minimal number of test cases, and coverage metrics can be generated to show how much of the system is being tested. The flow charts are very easy to understand and help collaboration with business users.
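Conceptually, what the tool is doing here is enumerating paths through a directed graph while omitting constrained branches, then reducing the path set to the minimum needed for the chosen coverage level. The sketch below illustrates the path-enumeration step only; it is our own simplification, not Agile Designer code.

```python
# A toy flowchart: each node maps to the labelled branches leaving it.
# Constrained (impossible) edges are simply omitted from the model.
flow = {
    "start":       [("age >= 18", "check_funds"), ("age < 18", "reject")],
    "check_funds": [("balance ok", "approve"), ("balance low", "reject")],
    "approve":     [],
    "reject":      [],
}

def test_paths(node="start", taken=()):
    """Depth-first enumeration of every start-to-end path; each path is one test case."""
    if not flow[node]:
        yield taken + (node,)
        return
    for label, target in flow[node]:
        yield from test_paths(target, taken + (node, label))

for i, path in enumerate(test_paths(), 1):
    print(f"Test case {i}: " + " -> ".join(path))
```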
Going beyond test case generation itself, Agile Designer provides test case management capabilities. These include the ability to import existing test cases that you may have developed so that they can be analysed in a fully managed environment. This allows you to do things such as identifying duplicate and zombie test cases that are no longer required, as well as enabling reuse.
This is particularly useful when managing changing requirements, as it reduces the need to manually update test cases and scripts. Instead, users can simply add or remove functional logic from the flowchart, and Agile Designer will automatically remove or repair broken, redundant or invalid tests, and will automatically generate any new ones required.
When used in conjunction with Grid-Tools' Datamaker, the requisite data for each test case can be derived, which can radically reduce the effort required to identify and access the right data for any particular test case, and there is a portal (Test Data on Demand) that enables the sharing of both test cases and test data. In addition to working with Datamaker, Agile Designer also works in conjunction with Grid-Tools' (and third-party) service virtualisation offerings.
Agile Designer also integrates with a number of technologies that companies may already be using to create test cases and automated test scripts, as well as several project management and test case design tools. In this context, Agile Designer serves as an accelerator for existing technologies, allowing users to optimise existing requirements and test cases, automatically generate automation frameworks, calculate cost and complexity, and manage change.
Grid-Tools offers a range of professional services which supplement the company's core product range.
Potential clients can sign up for a free 15-day trial of all of Grid-Tools' primary solutions. During this period, they will receive the full support of Grid-Tools' consultants, who will help demonstrate how the solutions can most benefit their development projects.
Once a client has settled on a tool, Grid-Tools aims to help them get the greatest benefit from it within their organisation. In addition to a full range of consultancy work packages, introductory training courses and workshops for users of all technical abilities are offered, and are frequently held in local regions including the UK, USA and India.
For companies wishing to "shift left" their testing, Grid-Tools consultants can work closely with an organisation's business analysts to remove ambiguity from requirements. They can then assist in using Agile Designer to optimise test case design and improve software development.
A one-day Agile Designer basics training course is offered to help users grasp the basics of visual flows and functionality, as well as more advanced functionality such as the HP ALM/QC integration. An Implementing Agile Designer workshop provides bespoke advice, based on existing test case design practices, on how to improve daily processes and how test cases are stored and managed.
Application Delivery
Last Updated: 6th December 2013
The CA Technologies Application Delivery solution is generally branded CA Lisa. It is, in essence, the CA Technologies take on DevOps, which means removing constraints from the software delivery lifecycle, automating the release and deployment of its outcomes, and capturing defect information and live traffic patterns for regression testing and the creation of simulated environments.
CA sells its products largely through its global direct channel, although it has been developing its partner programs lately. It has, for example, a relationship with SQS Software Quality Systems AG, a leading specialist in software quality and the first German company to list on the AIM (Alternative Investment Market) in London (as well as trading on the German Stock Exchange in Frankfurt am Main). SQS is headquartered in Germany and has about 2,800 staff, with offices in Germany, the UK, Egypt, Finland, France, India, Ireland, the Netherlands, Norway, Austria, Sweden, Switzerland, South Africa and the US. This gives CA Technologies access to SQS' client base, including half of the DAX 30, nearly a third of the STOXX 50 and 20 per cent of the FTSE 100 companies.
CA Technologies targets large multinational companies and government organisations such as California's Department of Motor Vehicles (DMV).
The vision of CA Technologies for Application Delivery comprises:
- CA Lisa Service Virtualisation;
- CA Lisa Release Automation;
- CA Lisa Data Mining.
Service virtualisation is about simulating and modelling systems that aren't built yet or are unavailable for practical reasons (often because they are committed to the business). This allows testing to begin earlier in the lifecycle (with fewer demands on infrastructure and software licensing) and enables parallel development and testing, which reduces cycle times and increases IT productivity.
Release automation automates the agile software development, test and production workflows, which promotes continuous application delivery and thus reduces deployment errors and time-to-market. This needs to deal with multi-tier application release deployment across distributed, heterogeneous infrastructures and also provide feedback of management information from the process.
Data Mining, as CA Technologies uses the term, is about rapidly generating automated regression test suites by capturing live traffic during application use; automating the creation and management of virtual services; and automating and simplifying defect capture, reporting and root cause analysis.
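CA has not published the internals, but the capture-to-regression step can be pictured as follows: each recorded request/response pair becomes an assertion that the service still behaves as it did when the traffic was captured. The sketch below is our own illustration; the capture format, URL and field names are all hypothetical.

```python
import urllib.request

# Hypothetical capture: one record per observed exchange.
captured = [
    {"method": "GET", "url": "http://localhost:8080/accounts/42",
     "expected_status": 200, "expected_body": '{"id": 42, "balance": 100.0}'},
]

def run_regression(exchanges):
    """Replay captured requests and report any response that no longer matches."""
    failures = []
    for ex in exchanges:
        req = urllib.request.Request(ex["url"], method=ex["method"])
        try:
            with urllib.request.urlopen(req) as resp:
                body = resp.read().decode()
                if resp.status != ex["expected_status"] or body != ex["expected_body"]:
                    failures.append((ex["url"], resp.status, body))
        except OSError as err:
            failures.append((ex["url"], "error", str(err)))
    return failures

print(run_regression(captured) or "all captured behaviour preserved")
```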
CA Technologies offers expected enterprise levels of services and support, with global 24x7x365 support if necessary, as well as classroom and web-based training.
Application performance and capacity management
Last Updated: 20th January 2014
The CA Technologies APM (Application Performance Management) approach is end-to-end, ranging from the most modern cloud environments, through distributed systems, to the mainframe applications that often still run enterprises' business and support customer-facing web front-ends. It uses analytics to manage the complexity of many application environments, with proactive performance alerts and the identification of root causes.
It claims to back this up with better capacity planning models - running out of storage or processing capacity is, in effect, the ultimate performance issue.
Application Performance Management is an important part of the overall feedback loop that drives DevOps at scale, which is based on the analysis of the business user experience as an input to a continuous design, build and delivery pipeline. If one sees IT Governance as the oversight of a company's investment in business automation technology, in the interests of achieving better business outcomes without waste, performance management and reporting is an enabler for better governance.
CA Technologies sells its products largely through its global direct channel, although it has strengthened its partner programs in recent times.
CA Technologies targets large enterprise customers, often companies managing infrastructure used to deliver products that the customers of CA Technologies themselves sell on. For example:
- Networking: Alvaka Networks uses CA Nimsoft Monitor to cut its administrative and maintenance costs and, through proactive troubleshooting and real-time monitoring and alerts, help its customers to achieve highly reliable delivery.
- Telecommunications: Arcor, part of the Vodafone Group, uses CA Spectrum Infrastructure Manager and CA eHealth Performance Manager to speed up fault resolution through centralised alerting and the capture of performance metrics. It claims to have reduced its IT networking overhead by 20 percent, with a unified systems management approach.
- Cloud services: Ativas provides hybrid managed services through both traditional and cloud-based platforms, delivered from a world-class European data centre. It uses a number of CA Technologies' products to support high reliability service delivery through best practice processes and good governance of managed IT services.
- Information services: DATEV is one of the leading information service providers in Germany, supporting more than 40,000 members and customers globally. It uses APM from CA Technologies to safeguard QoS across some 700-plus online applications monitored on multiple platforms.
APM from CA Technologies is built around many products including:
- CA Application Performance Management, for end-to-end transaction performance;
- CA Application Delivery Analysis - end-to-end application response time monitoring, with pro-active alerts, service desk and operations views and third-party integrations;
- Cloud Performance Management - using CA Cloud Monitor and CA Nimsoft Cloud User Experience Monitor for easy SaaS monitoring;
- Capacity management, using the data from APM etc. as an input to CA Capacity Management, which allows you to right-size your application delivery environment.
Broadcom (CA) Test Data Manager and BlazeData
Last Updated: 12th July 2021
Mutable Award: Gold 2021
Test Data Manager (TDM) is a test data management solution that uses data subsetting, data masking, and synthetic data generation to produce standardised, optimally covering, up-to-date, desensitised, and on-demand test data. In the process, it profiles your production data, creating an easily understood view of your data relationships and what data exists where. It’s accessed through a web portal, with guided workflows and instructional videos built-in to assist with its operation. It’s compatible with a variety of data sources, including DB2, Oracle, SQL Server and PostgreSQL, and supports both mainframe and distributed environments.
TDM readily integrates with other Broadcom products, including Agile Requirements Designer (ARD) and BlazeMeter. The latter has recently expanded significantly, and now stretches beyond just performance testing: it even offers its own synthetic data solution, BlazeData. TDM is still preferable for more complex test data management needs, while BlazeData excels at generating and delivering test data rapidly, dynamically, and at any time, up to and including test execution.
Customer Quotes
“With Test Data Manager we have evolved beyond manual processes to a powerful automated tool that can generate massive amounts of data in a very short amount of time.”
Manheim
“CA Technologies offered a compelling solution with its integrated tools that provide seamless, end-to-end test management. In addition, CA Technologies solutions were able to manage all the technologies used across our applications, which other solutions were unable to do.”
Williams
TDM works by automatically building a model of your data and its environment, then leveraging that model to create test data. To wit, it will profile your data and thereby detect the relationships and sensitive data contained therein. Moreover, it will produce a variety of visualisations based on your model, including a heat map that visualises PII hotspots (see Figures 1 and 2).
Data subsetting is available and supported by rules-driven data masking. Rules can be applied to either columns or tags, with over eighty masking rules provided out of the box (more can be added manually). Masking always maintains referential integrity, is fully auditable, can be applied to millions of rows in a matter of minutes either in-place or in-flight, and is demonstrably compliant with various mandates including GDPR. It can be applied to any number of environments at once, can be scheduled to run automatically, and displays a preview of the end results before committing them. You can also choose to retain your pre-masked data after the process has finished, allowing you to revert the process if necessary. If you change your mind, said data can be deleted easily.
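The crucial property here, masking that preserves referential integrity, is easiest to see with deterministic substitution: because the same input always masks to the same output, foreign keys still join correctly after masking. A minimal sketch of the principle (not TDM's actual rules engine) follows.

```python
import hashlib

FICTITIOUS_NAMES = ["Alex", "Sam", "Jo", "Chris", "Pat", "Robin", "Max", "Kim"]

def mask_name(value: str, seed: str = "project-seed") -> str:
    """Deterministically replace a real name with a fictitious one.

    The mapping is a pure function of the input plus a seed, so the same
    customer name masks identically in every table, and joins keep working.
    """
    digest = hashlib.sha256((seed + value).encode()).hexdigest()
    return FICTITIOUS_NAMES[int(digest, 16) % len(FICTITIOUS_NAMES)]

customers = [("C1", "Alice"), ("C2", "Bob")]
orders = [("O1", "Alice"), ("O2", "Bob"), ("O3", "Alice")]

print([(cid, mask_name(n)) for cid, n in customers])
print([(oid, mask_name(n)) for oid, n in orders])  # "Alice" masks the same way here
```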
Synthetic data generation is also available, and moreover, it is a standout feature. It relies on a user-created model of valid test data attributes that, once built, can be used to automatically generate synthetic data that achieves up to one hundred percent coverage while still being representative of your production data. This includes generating outliers, unexpected results, boundary conditions and negative paths. It is also resilient in the face of schema changes and so on.
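To make the notion of an attribute model concrete, the sketch below generates representative rows from a declared valid range, then deliberately appends boundary values and negative-path outliers. It is illustrative only, and the model format is our own invention; TDM's modelling capabilities are far richer.

```python
import random

# A hypothetical model of one attribute: its valid range.
model = {"age": {"min": 18, "max": 65}}

def synthesise(column, spec, n=5):
    lo, hi = spec["min"], spec["max"]
    values = [random.randint(lo, hi) for _ in range(n)]  # representative values
    values += [lo, hi]                                   # boundary conditions
    values += [lo - 1, hi + 1, -1]                       # outliers and negative paths
    return [{column: v} for v in values]

for row in synthesise("age", model["age"]):
    print(row)
```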
In addition, the product is designed to make test data easily accessible. For example, you can set up self-service and automated delivery of test data to testing teams; the product will dynamically build a test data warehouse (or mart) that functions as a central library of test data that can be reused on demand; and there is a ‘find and reserve’ feature that allows you to prevent data from being modified while you’re preparing it for use as test data.
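A 'find and reserve' mechanism, reduced to its essentials, is a reservation registry: once a tester has claimed a row of test data, no one else can claim or modify it until it is released. The following sketch is a generic illustration of that pattern, not Broadcom's implementation.

```python
import threading

class TestDataReservations:
    """Minimal 'find and reserve' registry for rows of test data."""

    def __init__(self):
        self._lock = threading.Lock()
        self._reserved = {}  # row id -> owning tester

    def reserve(self, row_id: str, tester: str) -> bool:
        with self._lock:
            if row_id in self._reserved:
                return False  # someone else is preparing this data
            self._reserved[row_id] = tester
            return True

    def release(self, row_id: str, tester: str) -> None:
        with self._lock:
            if self._reserved.get(row_id) == tester:
                del self._reserved[row_id]

pool = TestDataReservations()
assert pool.reserve("customer-42", "alice")
assert not pool.reserve("customer-42", "bob")  # blocked until alice releases it
pool.release("customer-42", "alice")
```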
TDM can (optionally) be deployed via a Docker container, although it loses all of its subsetting and some of its synthetic data functionality in the process. That said, this pattern is highly scalable, capable of parallelising jobs across many separate instances of TDM. One client, for instance, has purportedly used this method to mask 120 billion lines of test data in a matter of hours.
BlazeData is a second test data solution within BlazeMeter that can work either by itself or in concert with TDM. It is comparatively lightweight, trading functionality for ease of use, but notably allows you to tightly couple your test data to your functional tests: it automatically generates and delivers synthetic data to your tests on demand and at runtime, enabling your tests to be 'self-defining' and data-driven. You can also import test data from TDM for use in the same fashion. There is much more to say about BlazeData but, sadly, that will have to wait for another report.
TDM offers particularly mature synthetic data capabilities, including the ability to generate truly representative synthetic data sets via analysis of your production data. It also features competitive data subsetting, masking and profiling capabilities, making it an ideal solution if you want to combine synthetic data with data subsetting in a single platform. Moreover, TDM’s web portal provides a one-stop-shop for test data provisioning, while the test data warehouse, self-service, automated delivery, and find and reserve features facilitate reuse, collaboration and expedient test creation. These qualities are further enhanced by integration with ARD, itself a leading test design automation product.
BlazeData, on the other hand, provides test data management that is deeply embedded within your functional tests, with compatibility with performance tests, virtual services and more due (we are told) in only a few months. Baking your test data in at this level means that it can be created and delivered automatically and on-demand, automating it almost completely. We see this sort of all-encompassing testing as the next step in test automation, and we’re very glad to see it here.
The Bottom Line
Test Data Manager and BlazeData are each formidable, highly automated test data management solutions; together, they are more formidable still. Whether you prefer the functionality and maturity of TDM or the simplicity and ease of use of BlazeData, at least one of them – if not both – should be on your radar.
Mutable Award: Gold 2021
Broadcom Agile Requirements Designer
Last Updated: 5th February 2020
Mutable Award: Gold 2019
Agile Requirements Designer (ARD) by Broadcom Continuous Testing is a tool for capturing and modelling your requirements visually in the form of flowcharts, from which test cases, test scripts and other testing assets can be generated (and, in the case of test scripts, executed) automatically.
As of the 3.0 release, ARD is provided via two separate components: Studio and Hub. Studio is the classic ARD experience as a standalone desktop application, while Hub is a single, centralised location for storing and managing your models. Hub also contains ARD Insights, which allows you to access and interact with ARD testing artefacts via a web browser, thus exposing them to your organisation as a whole.
Customer Quotes
“The CA Technologies solution [Agile Requirements Designer] will help us increase the efficiency of our business analysts by 10 percent and testers by more than 30 percent over the next three years.”
Rabobank
“CA Technologies has been instrumental in helping us go beyond test automation to achieve end-to-end release automation.”
Williams
“After two business days, we had executed all 137 test scripts with 100 percent coverage and zero defects found.”
a.s.r.
ARD Studio uses flowcharts, assembled using a drag and drop interface, to model your actual requirements visually. One such flowchart can be seen in Figure 1. You can also build your flowcharts automatically by importing recordings from, for example, Selenium Builder. ARD can then use your flowchart to automatically generate the minimum number of test cases needed to satisfy your desired level of test coverage. Test scripts can be generated automatically based on these test cases, and the nature of the product means that traceability is always preserved between your test cases and your requirements. Existing test cases can be folded into the ARD environment, and there are test management capabilities built into the product. Requirements in ARD support integration with BPMN, XPDL and Microsoft Visio, and integration with Test Data Manager (TDM) can be used to provide test data, either synthetic or sourced (and masked), on demand prior to test execution. ARD also integrates with service virtualisation, lifecycle management, performance testing and test automation frameworks, as well as a number of APIs, both from Broadcom and from third parties such as Micro Focus, Parasoft and Ranorex. The latest release of ARD has also been optimised for handling very large or complex models.
ARD Hub allows you to store and manage all of your projects and flows in a central location. It features a flexible and customisable folder structure, check in/check out functionality, and support for database backups and high availability. It provides permissions-based access to your projects, complete with support for Active Directory, as well as full project versioning in the style of Git. This includes automated model branching, merging, and so on. Subflows are also handled and updated automatically and consistently by this versioning process. In addition, you can choose to view or edit any stored version of your project, each version effectively acting as a ‘snapshot’ of your application at a point in time, which is helpful for understanding previous versions of your system. Finally, ARD Hub has an automated migration capability, which can be used to automatically import all of your existing flows while maintaining existing links to requirements and subflows. This should make upgrading to ARD Hub relatively straightforward.
By storing all of your flows in a central location, ARD Hub opens up additional avenues for providing insight into your testing. This is harnessed by ARD Insights, which provides browser-based visibility and traceability, across your organisation, into the testing assets (including test flows) stored in ARD Hub. It also allows you to graphically view and explore the relationships between your flows (for example, subflow relationships) via the dependency visualisation view, as seen in Figure 2.
This can be useful for traceability, for impact analysis, and for understanding your system (for instance, during an onboarding process).
ARD’s major differentiators are twofold: the focus it places on ease of use and collaboration, and the breadth of capability it has access to by virtue of its integration capabilities and position in Broadcom’s Continuous Testing catalogue.
The former can be seen throughout the product. Flows in ARD are simple to assemble and to understand, in part because they model business requirements, not expected behaviour. The full project versioning and automated model merging featured in ARD Hub mean that they can easily be worked on by multiple people, and because they model requirements, and at the same time are visual and relatively simple, they are easy to understand even for nontechnical users. This all makes it relatively easy for testers and, say, business analysts to work side by side on the same project. Moreover, thanks to the release of ARD Hub, as well as ARD Insights making ARD accessible through your browser, it is simple for both technical and nontechnical users across the enterprise to access and explore your flows. This not only makes end-to-end collaboration easier, but allows your flows to be leveraged throughout your organisation to clearly understand your requirements and therefore your systems.
The latter is due to the breadth of testing products that Broadcom offers, as well as the, for the most part, high level of integration between them. ARD can be easily augmented with test data, via TDM; with service virtualisation, via Service Virtualization; and with performance testing and monitoring, via BlazeMeter Continuous Testing Platform. This puts ARD in a prime position, not just to enable automated test design, but to form the core of a comprehensive continuous testing solution and accelerate your in-sprint test automation.
The Bottom Line
ARD is a highly capable, requirements-based test design automation solution with an emphasis on test case collaboration and integration with the Agile release process. If any of those qualities appeal to you – and there’s little reason they shouldn’t – ARD should definitely be on your shortlist.
Mutable Award: Gold 2019
Broadcom Test Data Manager
Last Updated: 1st February 2024
Mutable Award: Gold 2024
Test Data Manager (TDM) is a test data management solution that uses data subsetting, data masking and synthetic data generation to produce secure, standardised, optimally covering, and up-to-date test data at scale, delivered on-demand via self-service. In the process, it profiles your production data, creating an easily understood view of your data relationships and what data exists where. All of its functionality is delivered through a single, unified web portal, with rich API support, guided workflows, and instructional videos built-in to assist with its operation. Although its most obvious use is to support enterprise-level testing, it has also been deployed to help with data security, regulatory compliance, cloud migration, and machine learning use cases.
It is compatible with a variety of data sources, including DB2, Oracle, SQL Server, and PostgreSQL, and it supports mainframe, cloud, and distributed environments. This selection has recently expanded to cover NoSQL databases, with a comparable level of scalability. Accordingly, the product offers native support for various data types specific to individual NoSQL databases; custom wrappers that are used instead of generic JDBC and ODBC connectors; and an improved engine for synthetic data generation that is optimised for the especially large volumes of data that are often found in NoSQL databases. The product can also operate alongside several third-party applications, and includes accelerator content for applications like SAP.
TDM readily integrates and synergises with other Broadcom testing products, including Agile Requirements Designer and Service Virtualization, by supplying them with reusable, realistic test data. The result is that TDM can be deployed alongside these products to create a broad platform for test automation driven primarily by Broadcom solutions. Integration into a wider testing environment is further supported by built-in data orchestration features that can, for example, be used to perform setup and teardown actions respectively before and after producing your test data. This can be very useful for smoothly incorporating your test data into complex and highly automated continuous testing pipelines.
Customer Quotes
“Test Data Manager has ensured compliance by masking millions of rows of complex sensitive information in minutes. Personal data is replaced with realistic but fictitious values, while maintaining the referential integrity needed for testing across each system.”
Leading Canadian Financial Institution
“Test Data Manager has enabled the highest possible performance when extracting small, more intelligent subsets from production.”
Large financial group
“Test Data Manager has helped us speed up testing and fill in missing test scenarios by using synthetic data generation, and the masking functionality gives us confidence that our client’s personal information is secure.”
Top 3 US Insurance Company
TDM works by automatically building a model of your data and its environment (see Figure 1) then leveraging that model to create test data. It will profile your data and discover the relationships and sensitive data contained therein. It will also produce a variety of visualisations based on your model, such as a heat map that visualises PII hotspots (see Figure 2). There is functionality for creating audit reports for your PII data, which may be helpful for demonstrating regulatory compliance, and there is a unified workflow for profiling, masking, and auditing your data. This workflow ensures that there is a common linkage between the detection of PII data, the remediation of said data, and your continuous compliance processes. Delta-based discovery scans (meaning they only consider data that was added or changed since the last scan took place) are also available, and may be especially useful for supporting agile development.
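Delta-based scanning is a simple but effective optimisation: keep a fingerprint of everything scanned last time and re-profile only rows whose fingerprint is new or changed. A rough sketch of the mechanism, under our own assumptions about how rows are fingerprinted:

```python
import hashlib
import json

def fingerprint(row: dict) -> str:
    return hashlib.sha256(json.dumps(row, sort_keys=True).encode()).hexdigest()

def delta_scan(rows, previous, profile):
    """Profile only rows that are new or changed since the last scan."""
    current = set()
    for row in rows:
        fp = fingerprint(row)
        current.add(fp)
        if fp not in previous:
            profile(row)  # e.g. run PII detection on this row only
    return current        # becomes 'previous' on the next scan

seen = delta_scan([{"name": "Alice"}], set(), profile=print)                  # scans Alice
seen = delta_scan([{"name": "Alice"}, {"name": "Bob"}], seen, profile=print)  # scans only Bob
```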
Data subsetting is provided, and is supported by rules-driven data masking. Rules can be applied to either columns or tags, with over eighty masking rules provided out of the box (more can be added manually). Masking always maintains referential integrity, preserves the data’s original format, is fully auditable, can be applied to millions of rows in a matter of minutes either in-place or in-flight, and is demonstrably compliant with various mandates (including GDPR) as well as the NIST cybersecurity framework. It can be applied to any number of environments at once, can be scheduled to run automatically, and displays a preview of the end results before committing them. You can also choose to retain your pre-masked data after the process has finished, allowing you to revert the process if necessary. If you change your mind, said data can be deleted easily.
Synthetic data generation is available. It relies on a user-created model of valid test data attributes that, once built, can be used to automatically generate synthetic data sets using either built-in or custom functions. These data sets can achieve up to one hundred percent coverage while still being representative of your production data. This includes generating outliers, edge cases, unexpected results, boundary conditions and negative paths. This process is also resilient in the face of schema changes and the like.
In addition, the product is designed to make test data easily accessible. For example, you can set up self-service and automated delivery of test data to testing teams; the product will dynamically build a test data warehouse (or mart) that functions as a central library of test data that can be reused on demand; and there is a ‘find and reserve’ feature that allows you to prevent data from being modified while you are preparing it for use as test data. Agile development in particular is supported via in-sprint provisioning of test data. Test data can also be delivered using a data cloning process that combines subsetting and synthetic data. This is designed to help data science teams create training data, with one or more subsets forming the bulk of the data set while synthetic data is leveraged to generate “what if?” scenarios and other cases of interest. Broadcom also has plans to exploit generative AI further down the road.
Moreover, instances of TDM can be easily and quickly deployed to multiple testing teams and/or business units with a consistent, templatised configuration via its enterprise configuration deployment functionality. This will be useful if you are moving away from the highly centralised “centre of excellence” model for enterprise test data and are instead looking at something more distributed, wherein test data responsibilities are more likely to be devolved to individual business units.
TDM can (optionally) be deployed via Docker or Kubernetes containers while retaining the bulk of its data masking, data discovery, and synthetic data functionality. Deployment of this kind is highly scalable, capable of parallelising jobs across many separate instances of TDM. Even greater scale is available for data masking in particular, wherein Kubernetes containers equipped with masking functionality are paired with the KEDA (Kubernetes Event-Driven Autoscaling) framework to automatically and dynamically provision and/or scale up containers as needed to mask across your enterprise, with minimal manual configuration needed.
TDM offers particularly mature synthetic data capabilities, including the ability to generate representative synthetic data sets via analysis of your production data. Moreover, this capability is positioned (and delivered) as a viable alternative to subsetting, rather than as merely an addendum to it. Using the two approaches together is also well-supported. In fact, its competitive data subsetting, masking and profiling capabilities in addition to its robust synthetic data generation make it an ideal solution if you want to combine synthetic data and data subsetting within a single platform.
What is more, Broadcom has a pronounced focus on serving large-scale, enterprise clients. TDM’s capabilities have developed accordingly, with this focus coming out in its emphasis on performance at scale and templatised deployment as well as its explicit scalability features, such as its use of KEDA alongside data masking containers. The company has noted that it believes data masking could become a core part of enterprise data security, but that this will only happen if masking at massive scale is a real possibility. Broadcom seems to be well on its way to making that the case.
In fact, this focus is apparent even in the capabilities Broadcom does not meaningfully offer. Database virtualisation, for example: Broadcom does offer a solution, in the form of Virtual Test Data Manager, but it is clear the product has been heavily deemphasised, and is not a priority for the company (and it only supports Oracle and SQL Server, to boot). This seems to be the case in large part because Broadcom does not believe that database virtualisation as a technology can scale up to serving enterprise customers without incurring prohibitive infrastructure costs.
Finally, TDM’s web portal provides a one-stop-shop for test data provisioning, while the test data warehouse, self-service, automated delivery, and find and reserve features facilitate reuse, collaboration, and expedient test creation. These qualities are further enhanced by its integration with Agile Requirements Designer and Service Virtualization, as well as its general capacity for integration and wide range of support for data sources. Its compatibility with NoSQL is especially notable.
The Bottom Line
Test Data Manager is a formidable, highly automated test data management solution that is especially well-equipped to serve clients that operate at enterprise scale.
Mutable Award: Gold 2024
CA Chorus
Last Updated: 1st August 2013
On the face of it, CA Chorus helps simplify, automate and streamline the management of a mainframe environment, increases staff productivity, and (with knowledge mining and knowledge transfer techniques) helps address skills silos and shortages; it claims to be the leading 21st century solution for reducing the cost and complexity of managing mission-critical mainframe operations.
However, the vision is a lot more than this. What CA Chorus does for the mainframe, it could do for all technology, enterprise wide. CA Chorus could be the lens through which you see the entire operational infrastructure, from a business service delivery point of view, across hybrid cloud environments, and across mainframe and distributed platforms. In essence, with CA Chorus, someone in a support role doesn't need to know what platform they are on or even which tool they are in; they simply get the information they need to fix the problem or escalate it to another role. We should note, however, that it is currently still mainframe-focused, although some of the tools it integrates may have distributed-systems capabilities as well.
CA Chorus is a role-based solution and CA Technologies sees building it as an incremental journey, role by role. The first role supported was DB2 database management, followed by security and compliance management and storage management; support for other roles (e.g. workload management) is on the roadmap. However, we feel sure the vision is much wider than this and that supported roles will extend beyond mainframe management (to some extent, they already do). The end-point depends, in part, on the vision and maturity of its customers and their ability to address their own internal silos.
CA sells its mainframe products, including CA Chorus, largely through its global direct channel, although it has opened its partner programs up to mainframe products since mid-2012.
CA Chorus is marketed to large enterprise mainframe customers at present. Fully effective exploitation of CA Chorus needs a reasonably mature enterprise culture, we think. Current customers include El Al, the Israeli airline.
CA Chorus is an integrated, role-based, collaborative, GUI workspace with knowledge-capture and knowledge-transfer features. As a workspace, it can integrate other tools either via specific integrations built by CA Technologies or via user-driven ad hoc integrations using published APIs. It also enables the automation of management processes. At present, it manages and automates mainframe management but it is built on open (open source) integration technologies and isn't inherently limited to mainframe (or even CA Technologies) applications.
Before developing CA Chorus, CA Technologies hired an established external product design firm that had already helped design Apple products, amongst others. This firm provided basic operations research on how real mainframe managers and support staff went about their operational tasks - and CA Chorus was designed to simplify and automate these real-world operations. It is very fundamentally role-based, rather than technology or product based - it helps people in a specific set of roles carry them out, cutting across technology and product silos; it is also being developed and marketed role-by-role.
CA Technologies offers strong support for the mainframe community, including the training of new mainframe support talent at its Mainframe Centre of Excellence in Prague. CA Chorus even has its own Facebook page. CA Technologies supports a rich set of user communities via MyCA; the general availability of CA Chorus was discussed, for example, in the global Mainframe Software Manager user community.
CA Technologies offers expected enterprise levels of services and support for CA Chorus, with global 24x7x365 support if necessary, as well as classroom and web-based training. Remember, too, that CA Chorus' knowledge capture and transfer capabilities can make CA Chorus itself into an enabler for in-house training.
CA CloudMinder
Last Updated: 22nd April 2013
Despite its name, CloudMinder is not really about cloud security but about identity and access management. Well known for its on-premise IAM capabilities, CA Technologies is expanding into offering a suite encompassing in-the-cloud, to-the-cloud and from-the-cloud use cases to allow access to internal and external resources for internal and external users.
CA Technologies has a worldwide presence and an active partner program. With a large installed base of enterprise customers for its on-premise products, CA has considerable opportunities for upselling its cloud services to its installed base.
CA Technologies targets large enterprise customers.
CA Technologies has new IAM capabilities in the for-the-cloud arena, for access to cloud-based platforms such as Salesforce. It has a strong focus on the use of social identities for user convenience.
It has a suite of offerings covering many of the IAM bases, but it is a newcomer to pure cloud applications and many features are still in development. These gaps should be filled over the next 12-18 months, boosting its competitive position.
CA Technologies offers expected enterprise levels of services and support for its products, with global 24x7x365 support if necessary, as well as classroom and web-based training.
CA Service Virtualization
Last Updated: 17th December 2018
Mutable Award: Gold 2018
CA Service Virtualization is a service virtualisation solution for both simulating and testing your services. The Enterprise Edition is proprietary and available as either a desktop or web application, while the Community Edition is its freemium counterpart. CA also develops CodeSV, a Java library for enabling service virtualisation that is freely available via GitHub.
Customer Quotes
"With CA Service Virtualization, applications are tested in an integrated and real environment. Tests are only conducted in isolation. Today, we can ensure the quality and smooth performance of applications with testing scenarios that perfectly simulate the operating environment."
Telefonica
"A team of three was able to build 450 test cases that can now be automatically executed in just a few hours, which would usually have taken a much larger team several days."
AusNet Services
"CA Service Virtualization helps us deliver applications that perform on day one."
Tech Mahindra
"CA Service Virtualization allows us to reduce the total amount of time spent on development."
Qualica
CA Service Virtualization Enterprise Edition allows you to create virtual services either by recording requests and responses to and from your real services, or by importing pre-recorded request/response pairs or design specifications (for example, Swagger files). It supports a wide variety of service types and data protocols that can be extended by building custom handlers through a provided framework. It will automatically choose the correct handler to capture request/response data and can capture this data in either a stateful or stateless fashion. Data can be automatically desensitised during recording based on criteria that you specify. Once recording is finished and your virtual service is active, it will respond to any recorded requests with the appropriate recorded response. Furthermore, the solution’s “magic string” and “magic date” capabilities allow it to respond appropriately to requests that contain unrecorded date or string data. In this case, the product will extrapolate from existing patterns in recorded request/response pairs to generate a new response. In addition, Enterprise Edition supports multiple methods for matching responses to requests, including creating your own custom matching script. You can also set a specific response time for your virtual services.
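At its core, record-and-replay service virtualisation is a keyed lookup from request shape to canned response, with a fall-through when nothing matches; the "magic string" behaviour described above layers pattern extrapolation on top of that lookup. The sketch below is our own minimal illustration of the replay side, not CA's implementation.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Recorded request/response pairs, keyed on method and path. A real tool
# also matches on headers and bodies, and extrapolates via "magic" tokens.
recordings = {
    ("GET", "/accounts/42"): (200, '{"id": 42, "balance": 100.0}'),
}

class VirtualService(BaseHTTPRequestHandler):
    def do_GET(self):
        match = recordings.get(("GET", self.path))
        # Unmatched requests could instead fall through to the real service.
        status, body = match if match else (501, '{"error": "no recording"}')
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), VirtualService).serve_forever()
```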
Virtual Services are deployed as Virtual Service Environments (VSEs) that are monitored and controlled through a web dashboard. You can toggle whether your virtual services should intercept requests to your real services, whether they should fall through to your real service if they can’t respond to a request, and so on. You can also toggle “learning mode”. When a VSE is in learning mode, it will monitor requests and responses to its corresponding real service and update itself accordingly. Moreover, this update can be done either manually (using session tracking) or automatically. Test cases for your virtual services can also be generated automatically and executed through your web browser. They integrate with other CA products such as CA Agile Requirements Designer, as well as Selenium. Load testing is built in and can be executed against both real and virtual services.
CA Service Virtualization Community Edition is the freemium counterpart to Enterprise Edition. As with the latter, it allows you to create virtual services by either recording or importing request/response pairs. It also contains an API explorer that lets you send requests and monitor responses, allowing you to test your virtual services (albeit manually).
Finally, CA also develops CodeSV, a Java library that allows you to create and leverage virtual services from within Java, and in particular, from within your Java unit tests. It can be downloaded freely through GitHub.
Testing is an extremely important part of the software development lifecycle. Continuous testing and test automation have both seen widespread adoption (particularly as part of Agile or DevOps-oriented development pipelines) and have proven very effective at ensuring software quality. Moreover, these technologies promise to dramatically reduce both testing cost and time to market. However, despite the theoretically large reduction in testing time provided by continuous and automated testing, many organisations are still finding themselves bottlenecked by either testing or quality assurance.
There is no silver bullet for this problem. However, it's likely that these bottlenecks can be widened significantly, and in some cases even eliminated entirely, by service virtualisation. Many of the delays testing and development teams experience are due to unavailable dependencies and restrictions on the use of third-party services; these are exactly the problems that service virtualisation was made to solve. CA Service Virtualization, in particular, promises a 25-50% reduction in dev/test cycle time, and the results speak for themselves, with customer success stories boasting up to an 85% reduction in testing cost and up to a 90% increase in service availability for testing purposes.
The Bottom Line
CA Service Virtualization Enterprise Edition is an eminently competent, feature rich and complete service virtualisation solution. The Community Edition, while understandably stripped down, still offers basic functionality and does so within an intuitive user interface and an affordable pricing model. If you are in the market for a service virtualisation solution, there is every reason for at least one of these products to be on your shortlist.
Mutable Award: Gold 2018
Datamaker
Last Updated: 2nd March 2015
Datamaker is Grid-Tools' test data management solution. Most vendors offering test data management do so either as an appendage to software suites that are primarily focused on application testing, or as an extension to archival and information lifecycle management products. Grid-Tools was the first specialist vendor to enter this space and, for a long time, was the only vendor able to generate synthetic data as opposed to merely subsetting databases. Many of its major competitors still lack this ability, and Grid-Tools' synthetic data capabilities remain more advanced than those of the vendors that do offer it.
It should be noted that Datamaker includes a full suite of data masking capabilities for companies that wish to use the product for database subsetting rather than generate synthetic data. Indeed, Grid-Tools directly markets a data masking solution in its own right.
Datamaker integrates with other Grid-Tools products such as Agile Designer (for automating the generation of test cases) and its service virtualisation capabilities. There is a Test Data on Demand Portal that allows users to share test data and test cases. Datamaker also integrates with third party service virtualisation products as well as testing environments from CA, HP and Bender RBT, amongst others.
Grid-Tools is not fussy about who uses Datamaker and has no hard industry focus, although it does concentrate to a certain extent on the healthcare and banking sectors, where it has had some notable success. Both sectors are heavily regulated, and compliance requirements around the protection of personal information are paramount in each.
More generically, the company's main emphasis is on companies that agree with its data-driven approach and who appreciate that agile development is as much about the data, and particularly the test data, as it is about the development processes themselves. It would argue, and we would agree, that you can't have a truly agile development process without agile test data to go with it.
In addition to its own teams, the company has an extensive network of partners across the globe, with trained staff in over 18 countries worldwide. These are split between regional resellers that serve the needs of local markets, and global strategic partners, which include CA Technologies, Accenture and HP. Partners in the Americas include Orasi Software and Softworx, while Central and South America is served by Green Light Technology. In Europe, the Middle East and Asia, partners include ANECON GmbH, Blue Turtle Technologies, Cast Info, INFA Partner, Infuse, Lemontree, Sothis Yazlim, Spica Solutions, WSM, MTP, Soflab Technology, and Software AG.
Grid-Tools has had some significant success in the financial sector though none of its major banking clients can be named. Projects range from establishing a new data warehouse to migrations. Government contracts and healthcare are also notable but again unnamed. The company's website provides a number of case studies though none of the named users will be familiar to the man on the Clapham omnibus.
Datamaker can be licensed either as an entire suite or on a modular basis. It runs on Windows, Linux, UNIX, IBM i-Series and IBM System z; note that other products in this space do not typically support System z. There are native masking drivers for DB2 z/OS, IMS, Oracle, SQL Server and Teradata, while the Data Archive module supports DB2, MySQL (Oracle), Oracle, SQL Server and Sybase (SAP). Datamaker as a whole also supports DB2 UDB, DB2 400, Actian (Ingres), PostgreSQL, Sybase (SAP), Informix (IBM) and InterSystems Caché, plus various flat file formats including Excel, VSAM/ISAM, CSV, TXT, SQL and fixed definition files. Synthetic data can be generated for HTML files but these cannot be subsetted or masked. XML files may be created and masked but not subsetted, and the same is true for HIPAA (4010, 5010 and X12), EDI and SWIFT files.
Arguably, the company's major strength is that it can support all forms of test data, whether that be a copy of the original data, a subset of that data (which can be generated in various ways to ensure appropriate coverage), or synthetic data that matches the profile of the original without actually using any of the data contained within the production database. This last approach resolves data protection issues automatically, since no production data is used; it also means, because Datamaker stores this data in its own test data warehouse, that data can be easily and quickly regenerated at any time without requiring any input from the DBA. As far as we know, Datamaker is still the only product on the market that can do this.
This ability to generate synthetic test data explains why Datamaker is modular. If you want to use this facility then you won't need to use the product's data masking capabilities while, on the other hand, if you just want to subset data you will probably want masking (if sensitive data is involved) but not want synthetic data generation. Similarly, you may or may not require the Archival module that forms a part of Datamaker and ditto for the SOA testing module.
In addition to test data management the company also markets data masking solutions (built-in to test data management), service virtualisation (which integrates with test data management), as well as requirements definition, script-less automation and automated test case generation.
One of the strengths of Grid-Tools solutions is their robustness, and in particular their ability to integrate with existing technologies. In addition to supporting all major database types, files, and mainframe formats, Datamaker offers two-way integration with HP ALM/QC. Agile Designer further integrates with several agile project management and virtualization technologies, as well as test case design, automation, and BPM technologies. These include URequire Studio, Critical Logic TMX, VersionOne, as well as BPMN-compliant tools like Cordys and Nimbus, HP ALM/QC and BPT.
Grid-Tools offers a range of professional services which supplement the company's core product range.
Potential clients can sign up for a free 15-day trial of all of Grid-Tools' primary solutions. During this period, they will receive the full support of Grid-Tools' consultants, who will help demonstrate how the solutions can most benefit their development projects.
Once a client has settled on a tool, Grid-Tools aims to help them get the greatest benefit from it within their organisation. In addition to a full range of consultancy work packages, introductory training courses and workshops for users of all technical abilities are offered, and are frequently held in local regions including the UK, USA and India.
With respect to Datamaker in particular, Grid-Tools offers strategic advice, training and support, plus free trials of the software, for organisations looking to implement end-to-end test data management. This often involves a period of on-site consulting, advising on technological and procedural improvements to increase the efficiency and effectiveness of software development while reducing costs.
Grid-Tools offers bespoke consultancy which focuses on the key areas of test data management. This includes data management and provisioning, Test Mart and Test Data on Demand packages. Synthetic data generation and data cloning packages are also available. Data security concerns are addressed by masking and subsetting work packages, and a Data Masking Audit is able to assess the effectiveness, cost and consistency of existing practices. A full audit of existing systems is also available, to eliminate "technical debt", while test matching services can be provided to reduce automated test failure.
A two-day Datamaker training course is offered, to guide users from the basics of the tool to more advanced tasks. The course shows users how data is organised within Datamaker, providing them with an understanding of its powerful synthetic data creation functions.
Easytrieve Modernised
Last Updated: 1st June 2024
Mutable Award: One to Watch 2023
Easytrieve developed out of what was a 4GL reporting tool, but think of it now as a low-code, highly productive report generator with added value from file and data manipulation. The aims of the modernised Easytrieve, according to Broadcom, are to:
- Give business and systems analysts better insights, by supporting more export/import formats (including CSV and Excel);
- Support agile report development through integration with tools such as the VS Code IDE; better code quality with improved syntax checks and simplified testing; and access to more data with simplified SQL support;
- Achieve greater efficiency for IT Operations with easier admin, improved data handling and enhanced control over, for example, file allocations.
Easytrieve has now been modernised in the spirit of the Open Mainframe Project and Zowe – it provides a rich and supportive user experience by combining the best aspects of mainframe-native and distributed tooling.
Quotes
“At least this method is something I’m more familiar with... it’s close to what I do on a day-to-day basis.”
Front-end developer on using VS Code instead of 3270
“The code itself is not too difficult to understand and it’s well documented. Since it’s all in VS Code, it’s not too daunting a task to open up and make modifications.”
DevOps Engineer
Easytrieve does what it says on the tin – it makes retrieving data and reporting on it easy, but it can also reformat and extend data. These days, Easytrieve v11.6 is available for z/OS, Windows, Linux and Unix, with roughly similar capabilities on each where appropriate (obviously, the Windows-based compiler and runtime environment is only available on v11.6 for Windows, for example). Easytrieve is pretty feature-rich – a selection of features we particularly like (full list of new features here) includes:
- The profiling and code coverage toolkit for z/OS performance insights;
- Language support for VS Code on Windows, which should make the product’s user experience more accessible;
- An extensive set of macros for effective data and file manipulation and/or transformation;
- Report output in XML format on all versions, which should make reporting across all platforms easier;
- Detailed technical features such as system-determined block size for efficiency/performance on z/OS – being available on several platforms mustn’t compromise performance on specific platforms;
- Automation of SQL SELECT statements for efficiency/productivity on all platforms;
- Many interoperability features across all platforms, such as 8-byte binary field support, big integer value support and date format override at program level (obviously, EBCDIC/ASCII code conversion and the like is handled automatically);
- ODBC SQL support on the Windows, Linux and Unix versions, so any database is accessible (see the sketch below).
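To give a flavour of what that ODBC support enables, here is a minimal Python sketch using the pyodbc library; the DSN, credentials, table and columns are hypothetical, and this is an illustration of the kind of cross-platform query that becomes possible, not Easytrieve syntax:

```python
import pyodbc  # Python binding to the ODBC driver manager

# Connection details below are purely illustrative
conn = pyodbc.connect("DSN=SalesDB;UID=report;PWD=secret")
cursor = conn.cursor()
cursor.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM orders GROUP BY region ORDER BY total DESC"
)
for region, total in cursor.fetchall():
    print(f"{region:<12} {total:>12,.2f}")
conn.close()
```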
You should care about modernised Easytrieve because silo reduction is necessary if companies are to “do more with less” and take full advantage of modern developments in IT. Broadcom has undertaken research with its customer base showing that, unsurprisingly, organisations have difficulty introducing new developers to mainframe tooling, and that a modernised experience can increase confidence levels and shorten the ramp-up time for a new developer. Mainframe 3.0 modernisation generally helps to attract new talent to the mainframe environment and to make that talent quickly feel at home. In addition, Easytrieve now offers a rich distributed systems capability, which reduces barriers to gaining cross-platform experience.
Mentoring and support are still needed, though. One developer said “I’m comfortable with VS Code but I’m not used to this syntax”, yet confidence increased markedly after participants had undertaken test problems in Easytrieve. One software developer said “given that I have done this now... If you asked me [to make] changes, I’d be comfortable”.
The Bottom Line
Easytrieve is well-established and trusted by the people using it. I first met it in the last century, when we used it to support end-user computing, allowing 4GL access to data in a (mainframe) transaction processing system without having to queue up for the IT group’s attention. It worked very well in that role then and has remained useful in that environment. Now, Broadcom has modernised Easytrieve, enabling DIY access to data (across more environments, not just the mainframe) for three personas: business and systems analysts; application and report developers; and IT operators. We think that it should, if implemented properly (with training available, and perhaps an Easytrieve champion/evangelist, the way it was done when I first met it), be a major catalyst for breaking down silos in the IT organisation. The future opportunity, we think, is to add a fourth persona, business end users, allowing managed end-user computing to take some of the load off the IT group.
Fast Data Masker
Last Updated: 2nd March 2015
Grid-Tools offers two forms of data masking: simple data masking and fast data masking, where the difference is that the former uses generic drivers and the latter uses native drivers as well as native database utilities. The functionality is also richer in GT Fast Data Masker. Both are part of the Datamaker test data management suite but the latter is also marketed as a stand-alone product called GT Fast Data Masker.
GT Fast Data Masker supports all the leading database environments as well as masking for flat files, and has an emphasis not just on masking the data and auditing the results (to ensure compliance) but also on the speed of its masking. There are some relevant test results (from a real customer) posted on the company's website. The product includes facilities to profile the data in advance of masking, to determine what data needs to be masked, as well as a comprehensive range of masking options, including a cross-referencing capability so that you can ensure consistent masking if you are masking multiple systems at the same time.
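Grid-Tools does not document how its profiler works internally, but the general technique can be sketched: flag a column as candidate PII when most of its values match a known pattern. In the minimal Python sketch below, the patterns and the 80% threshold are our own assumptions, not the product's:

```python
import re

# Illustrative patterns only; a real profiler would carry a much larger catalogue
PII_PATTERNS = {
    "ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[a-z]{2,}$", re.IGNORECASE),
}

def profile(column_name, values, threshold=0.8):
    """Flag a column as candidate PII if most of its values match a known pattern."""
    for label, pattern in PII_PATTERNS.items():
        hits = sum(1 for v in values if pattern.match(v))
        if values and hits / len(values) >= threshold:
            return (column_name, label)  # candidate for masking
    return (column_name, None)

print(profile("cust_ssn", ["123-45-6789", "987-65-4321"]))  # ('cust_ssn', 'ssn')
```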
Grid-Tools is not fussy about who uses Fast Data Masker and has no particular industry focus although it does, to a certain extent, focus on the Healthcare and banking sectors, where it has had some notable success. Both of these sectors are ones that are heavily regulated and where compliance requirements around the protection of personal information are paramount.
More generically, the company's main emphasis is on companies that agree with its data-driven approach and who appreciate that agile development is as much about the data, and particularly the test data, as it is about the development processes themselves. It would argue, and we would agree, that you can't have a truly agile development process without agile test data to go with it.
In addition to its own teams, the company has an extensive network of partners across the globe, with trained staff in over 18 countries worldwide. These are split between regional resellers that serve the needs of local markets, and global strategic partners, which include CA Technologies, Accenture and HP. Partners in the Americas include Orasi Software and Softworx, while Central and South America is served by Green Light Technology. In Europe, the Middle East and Asia, partners include ANECON GmbH, Blue Turtle Technologies, Cast Info, INFA Partner, Infuse, Lemontree, Sothis Yazlim, Spica Solutions, WSM, MTP, Soflab Technology, and Software AG.
Grid-Tools has had some significant success in the financial sector though none of its major banking clients can be named. Projects range from establishing a new data warehouse to migrations. Government contracts and healthcare are also notable but again unnamed. The company's website provides a number of case studies though none of the named users will be familiar to the man on the Clapham omnibus.
Fast Data Masker comes with multiple seed tables, with internationalised versions of these tables where appropriate (such as names), and you can also add your own seed tables. There is multi-column capability so that, for example, state and zip code will match. It also includes cross-reference management so that you can, say, retain the same transformations across runs or databases. You can also define your own functions and there is support for flat file masking. The product can update data directly within the database, but you also have the option of extracting data into a staging area, passing it through a masked view and then building shadow tables. Finally, there is also support for updating primary keys, which can be rebuilt automatically, as well as data discovery (profiling), version control and difference management, common column discovery to ensure that the same mask is applied to matched columns, and the ability to incorporate sub-setting within the masking process.
It is worth noting the importance of file-based masking, which is often required in conjunction with database masking. For example, you might have scrambled the social security numbers in the target database; an input file could then contain non-matching social security numbers, and the load would fail. Using the cross-reference table, or the hash routines employed by Grid-Tools as part of the masking process, you can ensure that this mismatch doesn't happen.
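Grid-Tools' actual hash routines are not public, but the principle, deterministic masking in which the same input always yields the same masked output wherever it occurs, can be sketched in a few lines of Python. The function and the site-specific secret are illustrative only:

```python
import hashlib

def mask_ssn(ssn, secret="site-specific-secret"):
    """Deterministically mask an SSN: identical inputs always produce identical
    outputs, so database rows and flat-file records stay in step."""
    digest = hashlib.sha256((secret + ssn).encode()).hexdigest()
    digits = "".join(c for c in digest if c.isdigit())[:9].ljust(9, "0")
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"

# The same SSN masked in the database and in an input file gets the same value,
# so a subsequent load does not fail on mismatched keys
print(mask_ssn("123-45-6789"))
print(mask_ssn("123-45-6789"))  # identical output
```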
Finally, it is important for data governance and compliance reasons that you can prove that you have appropriate processes in place to ensure the integrity of personally identifiable information. Grid-Tools provides workflow capabilities that allow you to define relevant procedures with, for example, checked, validated and approved stages in the identification of which data is to be masked. This lets the relevant people look at the data profiling information and confirm whether or not the data has been correctly identified as PII. The result is a rigorous audit trail of the due diligence you have taken to identify which data needs to be masked.
Databases supported by GT Fast Data Masker include Oracle, DB2, SQL Server, Ingres (Actian), Sybase (SAP), MySQL (Oracle) and Informix (IBM), as well as ODBC and flat files. In the case of DB2, support is provided for DB2/400 and DB2 on the mainframe as well as for distributed systems. Indeed, Grid-Tools is unusual in having a product that runs directly on mainframe systems.
Grid-Tools offers a range of professional services which supplement the company's core product range.
Potential clients can sign up for a free 15-day trial of all of Grid-Tools' primary solutions. During this period, they will receive the full support of Grid-Tools' consultants, who will help demonstrate how the solutions can most benefit their development projects.
Once a client has settled on a tool, Grid-Tools aims to help them get the greatest benefit from it within their organisation. In addition to a full range of consultancy work packages, introductory training courses and workshops for users of all technical abilities are offered, and are frequently held in local regions including the UK, USA and India.
In addition to the masking and subsetting work packages, a Data Subset and Data Masking training course equips users with the skills needed to reduce and mask copied data using GT Fast Data Masker and Data Subset. Users of all technical abilities can learn how to generate scripts, and how to build their own masking rules and routines.
Mainframe
Last Updated: 19th November 2013
CA Technologies offers a comprehensive set of mainframe software solutions. The mainframe remains the best platform for delivering automated IS management, with applications executing holistically across heterogeneous architectures, without silos, and with workloads optimised to run on the most effective technology available.
CA sells its mainframe products largely through its global direct channel, although it has opened its partner programs up to mainframe products since mid-2012.
CA Technologies' mainframe solutions are sold to large enterprises in all sectors.
CA Technologies offers a comprehensive suite of solutions that empower you to optimise efficiency, reduce risks and transform your cross-enterprise IT environment into a dynamic data centre, comprising:
- CA Chorus (see separate solution page);
- the Mainframe Software Rationalization Program;
- Workload Automation;
- Database Management for DB2;
- Database Management for IMS;
- Application Quality and Testing Tools (see separate solution page);
- Cloud and Linux on System z Management;
- Mainframe Databases;
- Output Management and Enterprise Report Management;
- Performance and Automation;
- Resource Management for Mainframe;
- Security Management for Mainframe;
- Software Change Management for Mainframe;
- Storage Management for Mainframe.
CA Technologies offers strong support for the mainframe community (see here), including the training of new mainframe support talent in its Mainframe Centre of Excellence in Prague. CA Chorus even has its own Facebook page. CA Technologies supports a rich set of user communities via MyCA; GA of CA Chorus was discussed, for example, in the mainframe-software-manager global user community.
CA Technologies offers expected enterprise levels of services and support, with global 24x7x365 support if necessary, as well as classroom and web-based training.
Symantec Website Security
Last Updated: 24th March 2016
Symantec's website security capabilities came from its acquisition of the authentication business unit of Verisign in 2010. Its Verisign SSL Certificate Service was rebranded as Norton Secured Seal in 2012.
Symantec operates as a certificate authority, offering SSL/TLS certificates. According to Netcraft, just under one-third of all SSL certificates worldwide were issued by Symantec. Designed primarily for enterprises, the service offers extended website security capabilities that include malware scanning, DDoS mitigation and performance optimisation. It provides centralised management and reporting of all certificates to ensure that all certificates remain valid. A code signing service is also provided, allowing organisations to verify software, file and application downloads for security. An additional service is Norton Shopping Guarantee, a free protection service for online shoppers.
In March 2016, Symantec launched its Encryption Everywhere service, which is an SSL certificate service aimed at hosting providers. This allows hosting providers to offer end-to-end website security services to their customers, enabling them to secure applications and data in motion for all their customers' websites. They are able to manage their customers' entire certificate lifecycle management processes, as well as offering them value-added services.
Symantec operates in more than 50 countries worldwide, with 51% of its revenues generated outside of the US. It has a large roster of sales partners worldwide, although it states that its strategy is to focus on fewer, more specialised partners. It also has a wide range of partnerships with other technology vendors, as well as OEM partners and ISPs. It is currently actively recruiting partners for its Encryption Everywhere service.
Symantec's website security products are aimed at enterprise and public sector organisations through its global sales force and partners. It has customers in almost every vertical industry.
Its Encryption Everywhere product is aimed at hosting providers worldwide. Initial hosting providers signed up after the launch include InterNetX, CertCenter and Hostpoint.
SSL certificates are valid for defined periods of time only. Symantec's website security services automate the entire lifecycle of certificates, from issuance to revocation and renewal. They help organisations - or hosting providers in the case of Encryption Everywhere - to manage their entire inventory of certificates: how many there are, what type they are, where they are deployed and when they expire. Certificates are replaced automatically when they expire and revoked when they are no longer valid. The certificates are used to protect websites, applications and data in motion.
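The lifecycle automation itself is a managed service, but the expiry-monitoring part of the job is easy to illustrate with standard-library Python; the hostname below is a placeholder:

```python
import socket
import ssl
import time

def days_until_expiry(hostname, port=443):
    """Fetch a site's certificate over TLS and report days until it expires."""
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    expires = ssl.cert_time_to_seconds(cert["notAfter"])  # e.g. 'Jun  1 12:00:00 2025 GMT'
    return int((expires - time.time()) // 86400)

# Placeholder hostname; an inventory tool would loop over every tracked certificate
print(days_until_expiry("example.com"))
```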
Other services offered as part of the website security offering include malware scanning, DDoS mitigation and performance optimisation.
All certificates provided under the service are branded by Symantec, providing visual verification that the site is safe for visitors so that they can trust the service offered. Symantec is one of the top three leading certificate authorities worldwide, with a brand that is highly trusted.