Roy Fisher and Data management

For the American journalist and Chicago Daily News Editor-in-Chief (1918–1999), see Roy M. Fisher.

Roy Fisher (born 1930) is a British poet and jazz pianist. He was one of the first British writers to absorb the poetics of William Carlos Williams and the Black Mountain poets into the British poetic tradition. Fisher was a key precursor of the British Poetry Revival.


Life

Fisher was born in Handsworth, Birmingham, and studied at the University of Birmingham. His early work, including City (1961), in which he applies the lessons of Williams's Paterson to the city of Birmingham, was admired in the United States but more or less ignored in his native country. Because of the negative connotations the name "Birmingham" carried for outsiders, the city is never named in that early long poem. In 2005 Fisher was elected a Fellow of the Royal Society of Literature.

Work

Fisher finally began to gain recognition in Britain with the publication of Poems 1955-1980 (1981). Between 1963 and 1971, he worked as Head of English and Drama at Bordesley College of Education. He then moved to the Department of American Studies at Keele University. He retired in 1982, after which he worked as a freelance writer and as a musician.

Fisher's later works include the long poem A Furnace (1986), Poems 1955-1987 (1988), The Dow Low Drop (1996), and Standard Midland (2010).

Outside the mainstream, Fisher is regarded by poets such as John Ash, Alan Baker and Peter Robinson, and by critics such as Marjorie Perloff, as one of the most important post-war English poets. News for the Ear: A Homage to Roy Fisher, edited by Peter Robinson and Robert Sheppard, appeared in 2000, and a book of critical essays, The Thing about Roy Fisher, edited by John Kerrigan and Peter Robinson, was published the same year.

Bibliography

City (Migrant Press, 1961)
Ten Interiors with Various Figures (Tarasque Press, 1966)
The Ship's Orchestra (Fulcrum Press, 1966)
Collected Poems (Fulcrum Press, 1968)
Matrix (Fulcrum Press, 1971)
The Cut Pages (Fulcrum Press, 1971; Shearsman, 1986)
The Thing About Joe Sullivan: Poems 1971-1977 (Carcanet Press, 1978)
Poems 1955-1980 (Oxford University Press, 1980)
A Furnace (Oxford University Press, 1986)
Poems 1955-1987 (Oxford University Press, 1988)
Birmingham River (Oxford University Press, 1994)
It Follows That (Pig Press, 1994)
The Dow Low Drop: New & Selected Poems (Bloodaxe Books, 1996)
The Long & the Short of It: Poems 1955-2005 (Bloodaxe Books, 2005)
Standard Midland (Bloodaxe Books, 2010)
Selected Poems, ed. August Kleinzahler (Flood Editions, US, 2010)
The Long & the Short of It: Poems 1955-2010 (Bloodaxe Books, 2012)

Data management

Data management comprises all the disciplines related to managing data as a valuable resource.


Overview

The official definition provided by DAMA International, the professional organization for those in the data management profession, is: "Data Resource Management is the development and execution of architectures, policies, practices and procedures that properly manage the full data lifecycle needs of an enterprise." This definition is fairly broad and encompasses a number of professions which may not have direct technical contact with lower-level aspects of data management, such as relational database management.

[Figure: The data lifecycle]

Alternatively, the definition provided in the DAMA Data Management Body of Knowledge (DAMA-DMBOK) is: "Data management is the development, execution and supervision of plans, policies, programs and practices that control, protect, deliver and enhance the value of data and information assets."
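Both definitions center on managing data across its lifecycle. As an illustration, the sketch below models a generic set of lifecycle stages (create, store, use, archive, destroy — a common textbook model, not DAMA's official phases) and enforces that a data asset can only move forward through them, one stage at a time:

```python
# Sketch of a generic data lifecycle (illustrative stages, not DAMA's official phases).
# An asset may only advance to the next stage; advancing past "destroy" is an error.

STAGES = ["create", "store", "use", "archive", "destroy"]

class DataAsset:
    def __init__(self, name):
        self.name = name
        self.stage = "create"  # every asset starts at the beginning of the lifecycle

    def advance(self):
        """Move the asset to the next lifecycle stage and return it."""
        i = STAGES.index(self.stage)
        if i == len(STAGES) - 1:
            raise ValueError(f"{self.name} is already destroyed")
        self.stage = STAGES[i + 1]
        return self.stage

asset = DataAsset("customer_addresses")
asset.advance()     # create -> store
asset.advance()     # store -> use
print(asset.stage)  # use
```

A real data management program would attach policies to each stage (retention periods before archiving, secure erasure at destruction); this skeleton only captures the ordered progression itself.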

The concept of "data management" arose in the 1980s as technology moved from sequential processing (first cards, then tape) to random-access processing. Since it was now technically possible to store a single fact in a single place and reach it via random-access disk, those arguing that "data management" was more important than "process management" used arguments such as "a customer's home address is stored in 75 (or some other large number) places in our computer systems." During this period random access was still slow compared with batch processing, so those arguing that "process management" was more important used batch processing time as their primary argument. As applications moved to real-time, interactive use, it became obvious to most practitioners that both disciplines mattered: if the data was not well defined, it would be misused in applications, and if the process was not well defined, it was impossible to meet user needs.

Corporate Data Quality Management

Corporate Data Quality Management (CDQM) is, according to the European Foundation for Quality Management and the Competence Center Corporate Data Quality (CC CDQ, University of St. Gallen), the whole set of activities, both reactive and preventive, intended to improve corporate data quality. The main premise of CDQM is the business relevance of high-quality corporate data. CDQM comprises the following activity areas:

Strategy for Corporate Data Quality: Because CDQM is affected by various business drivers and requires the involvement of multiple divisions in an organization, it must be considered a company-wide endeavor.
Corporate Data Quality Controlling: Effective CDQM requires compliance with standards, policies, and procedures. Compliance is monitored against previously defined metrics and performance indicators and reported to stakeholders.
Corporate Data Quality Organization: CDQM requires clear roles and responsibilities for the use of corporate data. The CDQM organization defines tasks and privileges for decision making.
Corporate Data Quality Processes and Methods: To handle corporate data properly and in a standardized way across the entire organization, and to ensure corporate data quality, standard procedures and guidelines must be embedded in the company's daily processes.
Data Architecture for Corporate Data Quality: The data architecture consists of the data object model - which comprises the unambiguous definition and the conceptual model of corporate data - and the data storage and distribution architecture.
Applications for Corporate Data Quality: Software applications support the activities of Corporate Data Quality Management. Their use must be planned, monitored, managed and continuously improved.

Topics in Data Management

Topics in data management, grouped by the DAMA-DMBOK framework, include:

Data Governance: data asset, data governance, data steward
Data Architecture, Analysis and Design: data analysis, data architecture, data modeling
Database Management: data maintenance, database administration, database management system
Data Security Management: data access, data erasure, data privacy, data security
Data Quality Management: data cleansing, data integrity, data enrichment, data quality, data quality assurance
Reference and Master Data Management: data integration, master data management, reference data
Data Warehousing and Business Intelligence Management: business intelligence, data mart, data mining, data movement (extract, transform, load), data warehouse
Document, Record and Content Management: document management system, records management
Meta Data Management: meta-data management, metadata, metadata discovery, metadata publishing, metadata registry
Contact Data Management: business continuity planning, marketing operations, customer data integration, identity management, identity theft, data theft, ERP software, CRM software, address (geography), postal code, email address, telephone number

Body of Knowledge

The DAMA Guide to the Data Management Body of Knowledge (DAMA-DMBOK Guide) was produced under the guidance of a new DAMA-DMBOK Editorial Board and has been available since April 5, 2009.

Usage

In modern management usage, one can discern a trend away from the term 'data' in composite expressions toward 'information' or even 'knowledge' in non-technical contexts: alongside data management there now exist information management and knowledge management. This trend is misleading, as it obscures the fact that it is still traditional data that is being managed or processed. The distinction between data and the values derived from it can be seen in the information ladder: while data can exist as such, 'information' and 'knowledge' are always in the "eye" (or rather the brain) of the beholder and can only be measured in relative units.

Integrated data management

Integrated data management (IDM) is a tools approach to facilitate data management and improve performance. IDM consists of an integrated, modular environment to manage enterprise application data and optimize data-driven applications over their lifetime. IDM's purpose is to:

Produce enterprise-ready applications faster
Improve data access, speed iterative testing
Empower collaboration between architects, developers and DBAs

Consistently achieve service level targets
Automate and simplify operations
Provide contextual intelligence across the solution stack

Support business growth
Accommodate new initiatives without expanding infrastructure
Simplify application upgrades, consolidation and retirement

Facilitate alignment, consistency and governance
Define business policies and standards up front; share, extend, and apply throughout the lifecycle

See also

Open data
Information architecture
Information management
Enterprise architecture
Information design
Information system
Controlled vocabulary
Data curation
Data retention
Data governance
Data quality
Data modeling
Information lifecycle management
Computer data storage
Data proliferation
Digital preservation
Digital perpetuation
Document management
Enterprise content management
Hierarchical storage management
Information repository
Records management
System integration