Loyola University Chicago

Information Technology Services

Glossary of Terms



Application Services
Ground-level, reusable functions related to processing data within new or legacy application environments. Application services commonly have the following characteristics:

  • they expose functionality within a specific processing context
  • they draw upon available resources within a given platform
  • they are solution-agnostic
  • they are generic and reusable
  • they can be used to achieve point-to-point integration with other application services
  • they are often inconsistent in terms of the interface granularity they expose
  • they may consist of a mixture of custom-developed services and third-party services that have been purchased or leased

    Typical examples of service models implemented as application services include the following:
  • utility service
  • wrapper service

Business Intelligence Tools
Business intelligence tools are a type of application software designed to aid in the analysis and presentation of data. While some business intelligence tools include ETL (extract, transform, load) functionality, ETL tools are generally not considered business intelligence tools.

Types of business intelligence tools:

  • Digital Dashboards - Also known as Business Intelligence Dashboards, Enterprise Dashboards, or Executive Dashboards, these are visually based summaries of business data that provide at-a-glance understanding of business conditions through metrics and Key Performance Indicators (KPIs). Dashboards have become a very popular BI tool in recent years.
  • Online Analytical Processing, commonly known as OLAP (including HOLAP, ROLAP, and MOLAP) - a capability of some management, decision support, and executive information systems that supports interactive examination of large amounts of data from many perspectives.
  • Reporting software generates aggregated views of data to keep the management informed about the state of their business.
  • Data mining - extraction of information from a database using software that can isolate and identify previously unknown patterns or trends in large amounts of data. A variety of data mining techniques reveal different types of patterns; these range from statistical methods (particularly business statistics) to neural networks as a more advanced means of analyzing data.
  • Business performance management (BPM)
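As a minimal illustration of the reporting category above, the following Python sketch aggregates raw detail records into a management-level summary. The departments and headcounts are invented sample data, not actual Loyola figures:

```python
from collections import defaultdict

# Hypothetical raw enrollment records: (department, headcount)
records = [
    ("Arts & Sciences", 120),
    ("Business", 85),
    ("Arts & Sciences", 95),
    ("Nursing", 60),
]

# Reporting software rolls detail rows up into an aggregated view
totals = defaultdict(int)
for department, headcount in records:
    totals[department] += headcount

for department, total in sorted(totals.items()):
    print(f"{department}: {total}")
```

Real reporting tools add formatting, scheduling, and distribution on top of this same aggregation step.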

Data Model
A data model is an abstract model that describes how data is represented and used.

The term data model has two generally accepted meanings:

  • A data model theory, i.e. a formal description of how data may be structured and used.
  • A data model instance, i.e. the application of a data model theory to create a practical data model for some particular application.

A data model theory has three main components:

  • The structural part: a collection of data structures which are used to create databases representing the entities or objects modeled by the database.
  • The integrity part: a collection of rules governing the constraints placed on these data structures to ensure structural integrity.
  • The manipulation part: a collection of operators which can be applied to the data structures, to update and query the data contained in the database.

For example, in the relational model, the structural part is based on a modified concept of the mathematical relation; the integrity part is expressed in first-order logic and the manipulation part is expressed using the relational algebra, tuple calculus and domain calculus.
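The three components can be sketched in Python, using an in-memory set of tuples as a stand-in for a relational structure. The table name, heading, and rows below are illustrative, not part of the definition:

```python
# Structural part: a "relation" as a set of tuples under a fixed heading
heading = ("id", "name", "email")
students = {
    (1, "Ada", "ada@example.edu"),
    (2, "Alan", "alan@example.edu"),
}

# Integrity part: a rule constraining the structure (here, unique ids)
def check_unique_ids(relation):
    ids = [row[0] for row in relation]
    assert len(ids) == len(set(ids)), "integrity violation: duplicate id"

# Manipulation part: operators that query the data (a simple selection)
def select(relation, predicate):
    return {row for row in relation if predicate(row)}

check_unique_ids(students)
print(select(students, lambda row: row[1] == "Ada"))
```

A real relational database expresses the same three parts as table definitions, declared constraints, and SQL queries.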

Data Element
In metadata, a data element is an atomic unit of data that has:

  • An identification such as a data element name
  • A clear data element definition
  • One or more representation terms
  • Optional enumerated values (codes)
  • A list of synonyms to data elements in other metadata registries

Data element usage can be discovered by inspecting software applications or application data files through a process of manual or automated Application Discovery and Understanding. Once data elements are discovered, they can be registered in a metadata registry.
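A data element entry of this kind might be recorded in a metadata registry as a simple structure. The field names and values below are illustrative, not drawn from any actual registry:

```python
# An illustrative data element as it might appear in a metadata registry.
data_element = {
    "name": "student_enrollment_status",                      # identification
    "definition": "Whether a student is currently enrolled.",  # clear definition
    "representation_term": "Code",                            # representation term
    "enumerated_values": {"E": "Enrolled", "W": "Withdrawn"},  # optional codes
    "synonyms": ["enrollment_flag"],          # synonyms in other registries
}

# A registry is then a lookup from element name to its description
registry = {data_element["name"]: data_element}
print(registry["student_enrollment_status"]["representation_term"])  # Code
```

Production metadata registries (e.g. those following ISO/IEC 11179) add versioning, stewardship, and governance around this same basic record.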

In telecommunication, the term data element has the following components:

  • A named unit of data that, in some contexts, is considered indivisible and in other contexts may consist of data items.
  • A named identifier of each of the entities and their attributes that are represented in a database.

Data Steward
Individuals who have primary responsibility for the security of information within their departments.

Enterprise Architecture (EA)
The process of translating business vision and strategy into effective enterprise change by creating, communicating and improving the key principles and models that describe the institution's future state and enable its evolution.  The practice of applying a comprehensive and rigorous method for describing a current and/or future structure and behavior for an organization's processes, information systems, personnel and organizational sub-units, so that they align with the organization's core goals and strategic direction.

Enterprise Architecture Principles
A set of guiding, value-based criteria applied to increase consistency and quality in technology decision making.

Infrastructure
The physical components and associated software used to support the flow and processing of information.

Coupling
A linkage between the parts of a system such that if one part is modified, the behavior of other parts may also be affected; the degree to which each system module relies on each of the other modules.

Loyola's Mission, Vision, and Promise

We are Chicago's Jesuit, Catholic University -- a diverse community seeking God in all things and working to expand knowledge in the service of humanity through learning, justice and faith.

Loyola University Chicago is the school of choice for those who wish to seek new knowledge in the service of humanity in a world-renowned urban center as members of a diverse learning community that values freedom of inquiry, the pursuit of truth and care for others.

Preparing People to Lead Extraordinary Lives.

Five Characteristics of a Jesuit Education

  • Commitment to excellence: Applying well-learned lessons and skills to achieve new ideas, better solutions and vital answers
  • Faith in God and the religious experience: Promoting well-formed and strongly held beliefs in one's faith tradition to deepen others' relationships with God
  • Service that promotes justice: Using learning and leadership in openhanded and generous ways to ensure freedom of inquiry, the pursuit of truth and care for others
  • Values-based leadership: Ensuring a consistent focus on personal integrity, ethical behavior in business and in all professions, and the appropriate balance between justice and fairness
  • Global awareness: Demonstrating an understanding that the world's people and societies are interrelated and interdependent

Loyola Protected Data
Loyola Protected data is any data that contains personally identifiable information concerning any individual and is regulated by local, state, or Federal privacy regulations, or by any voluntary industry standards or best practices concerning protection of personally identifiable information that Loyola chooses to follow. These regulations may include, but are not limited to:

  • Family Educational Rights and Privacy Act (FERPA)
  • Gramm-Leach-Bliley Act (GLBA)
  • Health Insurance Portability and Accountability Act (HIPAA)
  • Illinois Personal Information Protection Act
  • Payment Card Industry Data Security Standards (PCI DSS)

Loyola Sensitive Data
Loyola Sensitive data is any data that is not classified as Loyola Protected data, but which is information that Loyola would not distribute to the general public. This classification is made by the department originating the data. Examples of the types of data included are: budgets, salary and raise information, and possible properties for Loyola to purchase.

Metadata
Structured, encoded data that describe characteristics of information-bearing entities to aid in the identification, discovery, assessment, and management of the described entities.

Return On Investment (ROI)
The ratio of money gained or lost on an investment relative to the amount of money invested. The amount of money gained or lost may be referred to as interest profit/loss, gain/loss, or net income/loss. The money invested may be referred to as the asset, capital, principal or the cost basis of the investment.

ROI is also known as rate of profit. Return can also refer to the monetary amount of gain or loss. ROI may describe the return on a past or current investment, or the estimated return on a future investment, and is usually given as a percentage rather than a decimal value.
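The ratio described above can be computed directly. The following sketch expresses ROI as a percentage of the amount invested; the dollar figures are invented for illustration:

```python
def roi_percent(net_gain: float, amount_invested: float) -> float:
    """ROI: money gained or lost relative to the amount invested, as a percent."""
    return (net_gain / amount_invested) * 100

# A $10,000 investment sold for $11,500 has a net gain of $1,500:
print(roi_percent(11_500 - 10_000, 10_000))  # 15.0
```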

Personally Identifiable Information (PII)
Any piece of information which can potentially be used to uniquely identify, contact, or locate a single person. Examples include:

  • Social security numbers
  • Credit card and debit card numbers
  • Bank account numbers and routing information
  • Driver's license numbers and state identification card numbers
  • Personally identifiable information contained in education records
  • Personal health records

Service Orientation (SO)
An architectural approach in which a system is composed of services - cohesive, independent units of functionality whose respective responsibilities are specified in service contracts. Service orientation is intended to ensure various intrinsic system qualities such as simplicity, absence of redundancy or coupling among subsystems, flexibility, reconfigurability, and interoperability with other systems. These qualities, in turn, are expected to support the realization of a specific set of strategic goals and benefits at the business level.

Single Source of Truth
Single Source Of Truth (SSOT) refers to the practice of storing each piece of data in exactly one location (e.g. one cell of one row of one table of one database), from which all other federated applications in the enterprise reference it. Without this architecture, the enterprise accumulates incorrectly linked, duplicate, or denormalized data (a direct consequence of intentional or unintentional denormalization of the data model) and risks retrieving outdated, and therefore incorrect, information. Where duplicate representations of data are needed, they are implemented with pointers rather than duplicate database tables, rows, or cells, so that updates to the authoritative location are comprehensively visible to all federated database constituencies of the larger overall enterprise architecture.
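The pointer-based approach described above can be sketched as follows: two federated applications hold only a key referring to one authoritative record, rather than their own copies. The record identifiers and values are invented for illustration:

```python
# Single source of truth: the authoritative table holds the data exactly once.
persons = {
    "P001": {"name": "A. Student", "email": "astudent@example.edu"},
}

# Federated applications store only a pointer (the key), never a copy.
registration_app = {"enrollment_42": {"person_id": "P001", "course": "COMP 101"}}
billing_app = {"invoice_9": {"person_id": "P001", "amount": 1200}}

# An update to the authoritative record is immediately seen everywhere.
persons["P001"]["email"] = "new.address@example.edu"

person_id = billing_app["invoice_9"]["person_id"]
print(persons[person_id]["email"])  # new.address@example.edu
```

Had each application kept its own copy of the email address, the update above would have left the billing application with stale data.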

Total Cost of Ownership (TCO)
A financial estimate designed to help consumers and enterprise managers assess direct and indirect costs commonly related to software or hardware.

A TCO assessment ideally offers a final statement reflecting not only the cost of purchase but all aspects in the further use and maintenance of the equipment, device, or system considered. This includes the costs of training support personnel and the users of the system, costs associated with failure or outage (planned and unplanned), diminished performance incidents (i.e. if users are kept waiting), costs of security breaches (in loss of reputation and recovery costs), costs of disaster preparedness and recovery, floor space, electricity, development expenses, testing infrastructure and expenses, quality assurance, marginal incremental growth, decommissioning, e-waste handling, and more.
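Arithmetically, a TCO estimate of this kind reduces to summing the purchase price with recurring direct and indirect costs over the ownership period. The cost categories and dollar figures below are invented for illustration:

```python
def total_cost_of_ownership(purchase: int, annual_costs: dict, years: int) -> int:
    """Purchase price plus recurring direct and indirect costs over the period."""
    return purchase + years * sum(annual_costs.values())

# Hypothetical recurring costs for a system, per year
annual = {
    "support_and_training": 5_000,
    "electricity_and_floor_space": 2_000,
    "maintenance_and_upgrades": 3_000,
}
print(total_cost_of_ownership(purchase=40_000, annual_costs=annual, years=5))  # 90000
```

Note that in this example the recurring costs over five years exceed the purchase price itself, which is why TCO rather than sticker price drives sound procurement decisions.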

Utility service
Any generic service or service agent designed for potential reuse can be classified as a utility service. The key to achieving this classification is that the reusable functionality be completely generic and non-application specific in nature.

Utility services are used within Service Oriented Architectures (SOAs) as follows:

  • as services that enable the characteristic of reuse within SOA
  • as solution-agnostic intermediary services
  • as services that promote the intrinsic interoperability characteristic of SOA
  • as the services with the highest degree of autonomy

Wrapper service
Wrapper services are most often utilized for integration purposes; they encapsulate some or all parts of a legacy environment to expose legacy functionality to service requestors. The most frequent form of wrapper service is a service adapter provided by a legacy vendor. This type of out-of-the-box service simply establishes a vendor-defined service interface that expresses an underlying API to legacy logic.