Center for Government Interoperability

Documentation and Comments on Opportunities Checklist for New Software Projects

This page contains detailed documentation and comments regarding the checklist for opportunities when new tables or fields are designed into a business system.

Legacy systems (Data Architect)

Legacy systems should never come into being. This may sound overly critical, but whoever failed to continually refactor the data model to meet ongoing requirements failed, on a continual basis, to support the system's core mission. Smaller legacy systems can be fixed or replaced with little impact, but huge systems, which may also be integrated with external government organizations, should not be replaced by a brand-new system; that is too disruptive even when the government or contractor developers are excellent. Instead, the data model should be refactored one step at a time until it meets client requirements and is correctly normalized, at which point it becomes agile enough to meet future requirements. This takes careful planning, but it prevents skyrocketing costs and keeps all of the hidden pieces running, such as print jobs and reports, that requirements gathering for any proposed new system would miss.

The main focus of the IT shop then becomes continually keeping the data model in third normal form so that it stays future proof. This requires work and the implementation of repeatable procedures, but it protects the application from drifting into obsolescence. Not only is this method good for fixing legacy issues; these repeatable processes are also critical for recently installed systems, which will eventually decay without them.

The few legitimate reasons for completely rewriting a legacy system from scratch include situations where the database software or the computer language it is written in is no longer supported by the organization that created it.

How do you repair a legacy system one step at a time? Example: suppose you have a professional licensing system that licenses doctors, pharmacies, security guards, etc., and the data was incorrectly modeled so that the party holding the license is a sub-type of the license instead of the other way around (the problem being that multiple versions of the same person exist if they hold more than one license). Working in tiny steps, the first thing to do is simply create a centralized “Party” table and analyze which routines need to be converted to access the centralized table instead of the incorrectly modeled data. (See Discussion Ideas for Mitigating Risk During Software and Hardware Updates.) Draw a map of all routines that require the change and integrate them into the new model one at a time. Do this for each table until the whole legacy system is normalized. You will now have

  1. A system that is cheaper and better than any new system that could replace it, and
  2. Experience with, and procedures that prevent system decay for all of your applications.
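The step-by-step conversion described above can be sketched with SQLite in Python. The schema, names, and person-matching key here are illustrative assumptions, not the checklist's actual system:

```python
import sqlite3

# Hypothetical legacy schema: each license row carries its own copy of the
# licensee's identity, so a person with two licenses exists twice.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE license (
    license_no   TEXT PRIMARY KEY,
    license_type TEXT,
    holder_name  TEXT,   -- identity duplicated per license (the modeling error)
    holder_dob   TEXT
);
INSERT INTO license VALUES
    ('MD-1001', 'physician', 'Ana Silva', '1970-03-14'),
    ('PH-2002', 'pharmacy',  'Ana Silva', '1970-03-14'),
    ('SG-3003', 'guard',     'Raj Patel', '1985-11-02');
""")

# Step 1: create the centralized Party table and load one row per person.
con.executescript("""
CREATE TABLE party (
    party_id INTEGER PRIMARY KEY,
    name     TEXT,
    dob      TEXT,
    UNIQUE (name, dob)    -- simplistic match key, for the sketch only
);
INSERT INTO party (name, dob)
    SELECT DISTINCT holder_name, holder_dob FROM license;
""")

# Step 2: point each license at its party. Later steps drop the old columns
# once every routine on the conversion "map" has been switched over.
con.executescript("""
ALTER TABLE license ADD COLUMN party_id INTEGER REFERENCES party(party_id);
UPDATE license SET party_id =
    (SELECT p.party_id FROM party p
     WHERE p.name = license.holder_name AND p.dob = license.holder_dob);
""")

# One party now owns both of Ana Silva's licenses.
rows = con.execute("""
    SELECT p.name, COUNT(*) FROM party p
    JOIN license l ON l.party_id = p.party_id
    GROUP BY p.party_id ORDER BY p.name
""").fetchall()
print(rows)   # [('Ana Silva', 2), ('Raj Patel', 1)]
```

Each step leaves the system fully working, which is what makes the tiny-steps approach lower risk than a rewrite.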

Correct data modeling (3rd normal form), correct keys (Data Architect, IT)

Normalizing data unlocks process-oriented systems so that organizations can connect data across government and improve business processes.

Done well, normalization delivers less programming, faster project completion, less maintenance, fewer mistakes, more flexibility for the organization, and more opportunities.

Simply bringing organizational data into third normal form resolves business problems throughout the organization and improves IT alignment with the organization's mission in a powerful and cost-effective way.

Benefits of normalization:

  • Ensures that customer requirements are properly satisfied and that new requirements are easier to accommodate
  • Makes the database easier to maintain
  • Ensures structural stability of the data
  • Prevents the update anomalies that occur in non-normalized record structures
  • Enables record processing by a set of simple operators
  • Eliminates redundant data storage
  • Closely models real-world entities, processes, and their relationships
  • Structures data so that the model is flexible, giving the business side agility
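The update-anomaly and redundancy benefits can be shown concretely with a small SQLite sketch in Python; the tables and values are illustrative assumptions:

```python
import sqlite3

# Hypothetical non-normalized permit table: the issuing agency's address is
# repeated on every permit row, so one address change must be applied
# everywhere -- an update anomaly waiting to happen.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE permit_flat (
    permit_no    TEXT PRIMARY KEY,
    agency_name  TEXT,
    agency_addr  TEXT   -- depends on agency_name, not permit_no: a 3NF violation
);
INSERT INTO permit_flat VALUES
    ('P-1', 'Parks', '100 Main St'),
    ('P-2', 'Parks', '100 Main St');
""")

# Third-normal-form fix: the fact "Parks is at 100 Main St" moves into its
# own table and is stored exactly once.
con.executescript("""
CREATE TABLE agency (
    agency_name TEXT PRIMARY KEY,
    agency_addr TEXT
);
INSERT INTO agency SELECT DISTINCT agency_name, agency_addr FROM permit_flat;

CREATE TABLE permit (
    permit_no   TEXT PRIMARY KEY,
    agency_name TEXT REFERENCES agency(agency_name)
);
INSERT INTO permit SELECT permit_no, agency_name FROM permit_flat;
""")

# An address change is now one UPDATE, and every permit sees it.
con.execute("UPDATE agency SET agency_addr = '200 Oak Ave' WHERE agency_name = 'Parks'")
rows = con.execute("""
    SELECT permit_no, agency_addr FROM permit
    JOIN agency USING (agency_name) ORDER BY permit_no
""").fetchall()
print(rows)   # [('P-1', '200 Oak Ave'), ('P-2', '200 Oak Ave')]
```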

Business rules should be taken out of programming code and put into normalized tables so that they are de-siloed and available to be shared enterprise-wide. Legacy systems should be included because even when they are replaced, conversion will be far easier when their files are normalized. Also, maintenance headaches from these systems will be greatly reduced.

Continuous normalization will move IT towards greater alignment to the organization's mission where IT can nimbly deliver new features and solve problems in the most efficient way.



Interoperability: Cross agency data and process sharing opportunities, including, API, SOA, NIEM, and Web Services (Data Architect, IT and business side)

Is there a structured communication process available with potential stakeholders? Every potential integration point with other systems should be evaluated for loose vs. tight coupling, with any doubt resolved in favor of loose coupling.

Has the new data or system been analyzed regarding how it may play a role in enterprise-wide interoperability?

Can an additional "organization_ID" field be added so that multiple agencies can concurrently use tables or programs as a centralized "cloud" application? (Executive office, CIO, Data Architect, IT and business side)

Beyond security and performance considerations, the additional code to make this application concurrently shared may not require much programmer effort.

The concept that enables the creation of cloud software is that only one extra field per database record is needed to make it sharable. For example, if an inventory system has these fields: item description, location and serial number, then all that's needed to make the table sharable for every government organization in the whole state is to add an organization_ID field to it. This is a simplification, but it gives the general idea. It keeps all of the data logically separate so that each governmental entity only sees data that pertains to it, but since all of the data is physically in one table, legislators, analysts and budget officials can obtain unfettered business intelligence from the database.
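The one-extra-field idea can be sketched in Python with SQLite. The table, agency codes, and helper functions are illustrative assumptions, not a real statewide system:

```python
import sqlite3

# A shared inventory table with an organization_id column: one physical
# table, logically partitioned per agency.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE inventory (
    organization_id TEXT,
    item_desc       TEXT,
    location        TEXT,
    serial_no       TEXT
);
INSERT INTO inventory VALUES
    ('DMV',   'printer', 'Room 4',  'SN-100'),
    ('PARKS', 'mower',   'Depot 2', 'SN-200'),
    ('DMV',   'scanner', 'Room 9',  'SN-300');
""")

def agency_view(org_id):
    """What one agency sees: only rows that pertain to it."""
    return con.execute(
        "SELECT item_desc, serial_no FROM inventory "
        "WHERE organization_id = ? ORDER BY serial_no", (org_id,)).fetchall()

def statewide_counts():
    """What analysts and budget officials see: the whole physical table."""
    return con.execute(
        "SELECT organization_id, COUNT(*) FROM inventory "
        "GROUP BY organization_id ORDER BY organization_id").fetchall()

print(agency_view('DMV'))   # [('printer', 'SN-100'), ('scanner', 'SN-300')]
print(statewide_counts())   # [('DMV', 2), ('PARKS', 1)]
```

In a production system the per-agency filter would be enforced by views, row-level security, or the application layer rather than ad hoc queries, but the single shared table is what makes the cross-agency business intelligence possible.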

Cloud opportunities: Is there a business app cloud that this can become part of? Should a business app cloud site be considered (statewide, national, or for the general private business world)?

Can this software system be obtained from (a) open source (b) GOTS or (c) shareware sites? (Data architect, IT)

Can this software become (a) open source, (b) GOTS, or (c) shareware? Where possible, make it available to all as open source.

GOTS is the abbreviation for government off-the-shelf software.

The choice used to be "Build it or buy it?" That is old-fashioned. The new approach should be: build it if it doesn't exist yet, and make it open source so that no other government organization ever has to buy it again.

Can any business processes be improved with existing data or by changing the data model? Are there opportunities to re-engineer small, medium, or large core, foundational business processes with new data models? (Data Architect, IT and business side)

For example, can processes that are paper-based bottlenecks be eliminated with already-existing data that should be integrated?

The purpose of this check is to ask if this is a good time to bring together the business process improvement side with the IT side to conduct performance reviews of, and consider reengineering, business processes. It is intended to encourage analysts to avoid assumptions and logic traps and instead to trace the chain of processes through layers of abstraction to determine if the organization's mission can be more optimally achieved.

Can this data be replaced by a better source of data elsewhere or replace other data? (Data Architect, IT, business side and federal or state CIO)

Can whole tables be eliminated by consolidation and sharing?

For example, if there is a list of government agency account numbers duplicated throughout state government, the redundant lists could be replaced by a link to a centralized list or receive automated downloads when the centralized one is updated. This would save manual editing work and reduce errors produced through manual updates. Does NIEM have a better source?
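The automated-download idea above can be sketched in plain Python; the account data and the refresh function are illustrative assumptions:

```python
# Sketch of replacing duplicated agency account lists with automated
# refreshes from one centralized list.

central_accounts = {
    "1000": "General Fund",
    "2000": "Highway Fund",
}

# A local copy that has drifted through manual edits.
local_copy = {"1000": "General Fund", "2000": "Hwy Fund"}

def refresh_from_central(local, central):
    """Overwrite the local list so it mirrors the central one exactly.
    Returns the account numbers that were corrected, added, or removed."""
    changed = [no for no, name in central.items() if local.get(no) != name]
    removed = [no for no in local if no not in central]
    local.clear()
    local.update(central)
    return sorted(changed + removed)

print(refresh_from_central(local_copy, central_accounts))  # ['2000']
print(local_copy == central_accounts)                      # True
```

In practice the "download" would be an API call or a database replication job triggered whenever the centralized list changes, but the effect is the same: manual re-keying, and the errors it produces, disappear.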

Prioritize super connector fields and super connector tables. (Data Architect, IT and business side)

Super connector fields are those that cross agency boundaries.

Super connector tables are the most important tables that can be shared in-house and externally. These must be listed in a transparent, centralized database and reviewed for use whenever a new system is designed.

Vendor created systems must be reviewed for table/data sharing and naming conventions.

Can this data be used to validate other data or does it need validation performed on it? (Data Architect, IT and business side)

Business Intelligence - data mart/data warehouse opportunities (Business side)

Is there value to adding this to a data warehouse? Should a data warehouse be created if there is none? Can business clients be brought together to collectively discuss their requirements and justification for a data warehouse?

Can clients use the data to analyze business trends, discover fraud, or populate geospatial systems?

Are there informatics uses such as geoinformatics, criminal justice informatics, and business informatics?

Data harmonization problems or opportunities (Data Architect, IT and business side)

Semantic interoperability opportunities exist throughout all organizations, for example, "Distributed Cost" is called "Cost Allocation" elsewhere in the same organization. These terms should be unified and standardized enterprise wide to simplify business processes. Potential candidates are same words with different meanings, and different words with the same meaning. Does NIEM already have information on this field?

Standards evaluation - Are there standards to be adhered to or created? (Data Architect, IT and business side)

Data standards, business standards, naming conventions, etc. For example, every state agency could have the same standard for this field: Corporation_Tax_ID_Number 40 characters - alphanumeric.

If data is exported externally, check whether a standard name already exists for this field.

Are field, program, and table names self-documenting, and do they follow consistent naming conventions?

Does this reveal the need for new standards to be created?

Alignment to organizational mission. Strategic planning problems or opportunities (Data Architect, IT and business side)

Enterprise Architecture planning. How does it align with the "To Be" architecture?

Impact on other systems (Data Architect, IT and business side)

Enter it into a dependency database (what systems does it impact, and what systems impact it?). This might include the change control board system, ITIL, etc.

Metrics generation opportunities (business side)

Can this field or table create useful metrics or appear on a dashboard? Customers could include boards, licensees, the public, finance, the governor's office, and the legislature.

Metadata opportunities (business side)

KPI - Key performance indicators opportunities (Data Architect, IT and business side)

Are there opportunities to use the field/table to measure performance?

Risk (Data Architect, IT and business side)

Can some data be put into production within strategically staggered time frames to reduce risk?

See risk-management-when-implementing-change.htm

Security (Data Architect, IT, business side and ISO)

Does the data require encryption, authentication, and authorization? Should backups be encrypted? What controls should be applied?

Should the security department review this matter?

Is data subject to legislative oversight or mandates? (Data Architect, business side)

E.g., the Health Insurance Portability and Accountability Act (HIPAA), the California Database Breach Act (California SB 1386), FIPS, HSPD-12.

There needs to be a table of federal, state, and departmental regulatory mandates or voluntary guidelines that reviewers check data against.

Should clients be given control of the data? (Business side, IT)

Giving clients more control over their data, where feasible, often improves process efficiency.

Would a data steward be useful for this data? (Data Architect, IT and business side)

Backup considerations

How often should it be backed up? How does it get refreshed when there is a crash? When should it be purged? (IT and business side)

Can data quality be improved? Is data cleansing applicable? (IT and business side)

Why is the data not clean? Are there collaboration opportunities with outside agencies to validate the data?

Is data analyzed to identify when quality threshold or target levels are not being achieved?

Quality management (Data Architect, IT and business side)

Are clients satisfied? Is quality management and continual process improvement built into this system?

All data and software apps should have someone assigned to be responsible for them. This is to avoid responsibility ambiguity.

There should also be a method for stakeholders to conveniently make suggestions and contact responsible staff for all data and apps.
Convenience in making suggestions is a key factor in ensuring that administrators receive feedback in a timely and consistent manner.

The responsible person for the data and apps would be assigned to:

  1. Create the suggestion method (suggestion box, suggestion email, online forum, etc.) and ensure that it is convenient to use
  2. Communicate to stakeholders the location or method for making suggestions and make the location or method easy to locate
  3. Review incoming suggestions and forward suggestions to appropriate staff responsible for improvements, and notify suggesters that suggestions were received
  4. Bring suggestions to a disposition and notify suggesters of the disposition (a) implemented (b) rejected (c) on back burner; waiting for analysis when more data or opportunities are available, etc.
  5. Every software application should have its own built in suggestion box on the application itself so that clients can intuitively use it to send suggestions directly to those responsible for the application at the time of use.

Suggestions should flow into a centralized database for analysis in order to discover overarching patterns and enterprise-wide improvement opportunities.

Automated duplicate detection (IT)

Timeliness (IT and business side)

Is there value to the organization if the data is refreshed sooner or in other ways?

Is the data coming from the best sources (lineage; most reliable, timely)?

Review for entry into a table of future opportunities and linked to a calendar of related opportunities or future change events (Data Architect)

If a related component was scheduled for updating, that would trigger an automatic reminder to review opportunities for this component.

For example, assume two large systems have a data sharing opportunity but there is no budget for implementation. Add the opportunity to a "waiting list" of opportunities. In the future, when work is being done on the system that can include the task, or a budget becomes available, the data sharing can be realized.

Specify the priority on architect's data design waiting list.

Optimization by combining multiple projects (past, future or ongoing projects) (Data Architect, business side, IT)

Are there other projects going on elsewhere in the state or federal government that would benefit from combining with this one?

Are there opportunities from making this available to a broader audience? (Data Architect, IT and business side)

Customers that are not immediately evident could include government boards, licensees, the public, NIEM, police and investigative organizations, finance, governor's office, and the legislature. See Open Data and Open Government.

Audit policy - should the field or table have its edit or use history recorded? (IT and business side)

Error management. How should the system respond to errors? E.g., should stakeholders or IT be notified if there are errors in the data? (IT and business side)

Are the business rules table-driven and controllable-by-clients where feasible? (Business rules should not be in programming code.) (IT, Data Architect)
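A table-driven rule can be sketched with SQLite in Python. The late-fee schedule, schema, and amounts are illustrative assumptions; the point is that the rule lives in data, not code:

```python
import sqlite3

# The late-fee schedule is stored in a table that authorized clients can
# edit, instead of being hard-coded in a program.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE late_fee_rule (
    license_type TEXT PRIMARY KEY,
    grace_days   INTEGER,
    fee_dollars  INTEGER
);
INSERT INTO late_fee_rule VALUES
    ('physician', 30, 150),
    ('pharmacy',  15, 300);
""")

def late_fee(license_type, days_late):
    # The code only interprets the rule; the rule itself is data.
    grace, fee = con.execute(
        "SELECT grace_days, fee_dollars FROM late_fee_rule "
        "WHERE license_type = ?", (license_type,)).fetchone()
    return fee if days_late > grace else 0

print(late_fee('physician', 45))   # 150
print(late_fee('pharmacy', 10))    # 0

# A policy change is a data update, not a code release:
con.execute("UPDATE late_fee_rule SET fee_dollars = 200 "
            "WHERE license_type = 'physician'")
print(late_fee('physician', 45))   # 200
```

Because the rule is a row in a shared table, other agencies and applications can read the same rule, which is the de-siloing benefit described earlier.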

Data tuning - is it fast enough? Is denormalization required? (IT)

Generally, data should be derived in real time when needed, where feasible; however, where response time requires it, denormalization and other techniques should be considered.

Field size and type - will the field be able to contain larger amounts of data 10 years from now? Are there automatic alerts to notify clients and developers when files reach sizes large enough to overload the system? Are tables auto-tuned? Is the field the right type (alphanumeric, numeric, etc.)? (IT, Data Architect)

Note: this checklist is not a data dictionary.

Field and table added to data dictionary (IT, Data Architect)

Restartability (IT)

Have the data and system been designed so that crashed programs restart without concern for partially written data or partially completed processes? Has it been clearly documented and communicated to clients and IT staff that process restarts are safe?
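One common way to achieve this is to make each batch step idempotent, so that a restart skips finished work instead of double-applying it. Here is a minimal sketch with SQLite in Python; the payment/ledger schema is an illustrative assumption:

```python
import sqlite3

# Each record's completion is recorded in the same transaction as its
# effect, so re-running after a crash cannot double-post anything.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE payment (payment_id TEXT PRIMARY KEY, amount INTEGER);
CREATE TABLE ledger  (payment_id TEXT PRIMARY KEY, amount INTEGER);
INSERT INTO payment VALUES ('A', 10), ('B', 20), ('C', 30);
""")

def run_batch():
    applied = 0
    payments = con.execute("SELECT payment_id, amount FROM payment").fetchall()
    for pid, amount in payments:
        done = con.execute(
            "SELECT 1 FROM ledger WHERE payment_id = ?", (pid,)).fetchone()
        if done:
            continue  # already processed before the crash
        with con:     # effect and completion marker commit atomically
            con.execute("INSERT INTO ledger VALUES (?, ?)", (pid, amount))
        applied += 1
    return applied

print(run_batch())   # 3  (first run processes everything)
print(run_batch())   # 0  (a restart is a no-op, not a double-post)
```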

Energy savings (Green unit, business side, IT)

Has the system been designed to make default reports online instead of on paper? Has server virtualization been considered, etc.?

Unstructured data links - Can HTML links be made to useful web sites at strategic places on the application? (Business side)

Web pages are unstructured data but may give consumers a richer information environment. For example, a government professional licensing application may contain physician license data for public verification. If a physician works for a corporation, a link to the state agency that licenses corporations, where data about the corporation is located, may give consumers a more integrated experience.

Does a new business process need to be created to improve data or add new data? Does a committee of business and IT members need to be created to analyze and implement the new business process to improve or create additional data? (Data Architect, IT and business side)

Transparency (Business side)

Can the system bring more transparency to the organization? How can more transparency be built into the system and the organization regarding any related business processes?

New mobile app opportunities? Are there new opportunities to use mobile apps with this data or program? (Data Architect, IT and business side)

ADA (IT side)

The Americans with Disabilities Act (ADA) generally requires that state and local governments provide qualified individuals with disabilities equal access to their programs, services, or activities unless doing so would fundamentally alter the nature of their programs, services, or activities or would impose an undue burden. One way to help meet these requirements is to ensure that government websites have accessible features for people with disabilities.

Enterprise Architecture (Enterprise Architect)

If your organization has an enterprise architecture program, does the new business process, IT system, or data need to be incorporated into the EA process? (Data Architect)

Core mission software built by outside organizations (CIO, Contracts Division, Legal, Data Architect)

New Core Mission Software Systems purchased from outside organizations – does government retain control of the data model and application?

New software systems – if this is a core mission software system, retain ownership of your data model and application. Never use an outside vendor's proprietary data model, because the data model is the mechanism for delivering IT services to clients. Once the ability to change the data model is handcuffed, there is no agility and no easy way to add new services; your data modeler might as well not exist. Critical vendor incentives to innovate disappear, and the client is locked into rigid contractual and logistical constraints.