IT Data Quality Architect / System Analyst Resume
Basking Ridge, NJ
SUMMARY
- 22+ years of experience designing and implementing solutions to cutting-edge industry standards
- Innovative, visionary technologist with a proven track record of leveraging technology to accomplish strategic and tactical business objectives, helping organizations implement major expense-reducing, revenue-enhancing, and business-flexibility improvements.
- Enterprise Data Management Services: Data Governance, Data Quality Management, Data Warehousing, Data Integrations, Business Intelligence, Customer Relationship Management and Enterprise Resource Planning.
- Strong hands-on experience with modern architectures and legacy technologies; exceptional analytical skills, learning agility, and effective communication; self-motivated, determined, and success-oriented; a natural leader and excellent team player; trustworthy and hardworking.
- Significant experience in IT applications and data processing in the healthcare, financial services, insurance, and automotive industries across 15+ projects, through all phases of the software development life cycle (SDLC) including requirements analysis, design, configuration, integration, testing, deployment, and maintenance.
TECHNICAL SKILLS
Data Governance: IBM InfoSphere Business Glossary 9.1x, Blueprint Director 2.2.0, MS Azure Data Catalog, Business Glossary, Data Warehousing, Data Integrations, Application Development
Data Quality: IBM InfoSphere suite; Information Analyzer 9.1x, Discovery Studio 4.x, Data Quality Console, SAS DataFlux Data Management Studio 4.2, Informatica Data Explorer 9.6, Data Quality 9.6, Developer; MS SQL Server Data Quality Services
Business Intelligence: Cognos Report Studio 10.x, Tableau, MS Power BI, Business Objects
Data Integration: Informatica Power Center 9.5, IBM InfoSphere DataStage 11.x, QualityStage
ERP: SAP BI 7.0, BW 3.x, SD, ECC 6.0
Database: DB2, UDB, Oracle, dBase IV, MS Access, MySQL 5.x, Teradata
Software: SPUFI, QMF, File-AID, SCLM change control, MQSeries, IBM Debugger, XPEDITOR, SyncSort, SQL Developer, TOAD, HP Quality Center 9.2, MS Office suite 2013, IBM Data Studio 3.2.x, JDA, XPC (Xpress Commerce)
Languages: COBOL, CICS, JCL, C, SQL, PL/SQL, JavaScript, HTML, DHTML, CSS, Java, XML, XSL, Python
Regulatory Compliance: BSA/AML, CCAR, DFS 504, Sarbanes Oxley and knowledge of GDPR.
Systems: MVS/ESA, z/OS, OS/390, TSO/ISPF, JES2 & JES3, Windows 7, MS-DOS, UNIX, Linux
PROFESSIONAL EXPERIENCE
IT Data Quality Architect / System Analyst
Confidential - Basking Ridge NJ
Responsibilities:
- Actively participates in the organization’s reporting delivery and data governance processes, tools and programs
- Works with leadership to identify data domains and domain leads and assists in the development and delivery of appropriate training across the clinical, operational and financial domains
- Works with business owners to define and establish data quality rules, definitions and strategy consistent with department and organizational strategies and goals
- Participates in the establishment of standards for measurable data quality and implements appropriate tools to monitor quality, completeness and adherence
- Works collaboratively with Data Stewards from the business, clinical and IT areas to ensure data quality and availability while identifying gaps to be remediated.
- Develops and compiles Data Governance Metrics from all domains and reports to the Data Governance Council.
- Balance short-term versus long-term actions and strategic versus tactical requirements while continuing to move toward the strategic vision; participate in the road map to achieve it.
- Interact with Data Owners; provide framework, stewardship, governance, and decision making for management of enterprise data quality for project development teams, business users, and other stakeholders.
- Advocate data quality best practices and standards, process flows, architectural roadmaps, thresholds, and tools that promote a seamless process; act as a change agent with end-to-end solutions.
- Define governance and best practices around meta-data to ensure an integrated definition of data for enterprise information, and to ensure the accuracy, validity, reusability, and consistent definitions for common reference data.
- Hands-on experience with the Enterprise Data Quality Management lifecycle: discovery, profiling, measurement, rule definition, monitoring, reporting, and remediation.
- Established healthcare-industry data rules to monitor data quality; experienced working with large data sets, with exceptional analytical, conceptual, and problem-solving abilities.
- Hands-on experience with data profiling to underpin a variety of information management programs, including data quality assessment and validation, metadata management, data integration and ETL, data migration, and modernization projects.
- Authored and delivered data analysis presentations, ran data quality analysis review sessions, and defined data measures and remediation processes both upstream and downstream.
- Developed and managed the organization's Data Quality/Data Profiling Program using the IBM InfoSphere suite (Information Analyzer, InfoSphere Discovery, DataStage, QualityStage) and SQL to evaluate complexity across business domains, perform enterprise-level data analysis, and discover hidden sensitive data for data governance.
- Designed, developed, and implemented an Enterprise Data Quality Dashboard that provides continuous visibility into the quality of data required for key business processes, enabling immediate reaction to and correction of data quality exceptions and issues.
- Advanced skills with spreadsheet, project management, and presentation software, including a strong working knowledge of MS Excel, MS PowerPoint, MS Project, and MS Visio.
Environment: IBM InfoSphere Suite; Information Analyzer 9.1x, Discovery Studio 4.5x, Data Quality Console, QualityStage, Data Studio 3.2.x, Business Glossary 9.1.x, Blueprint Director 2.2.0, Cognos Report Studio, DB2, Mainframe Copybook, UNIX Shell Scripting.
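The rule-based monitoring described above can be sketched in a few lines of Python. This is a minimal illustration only; the rule names, columns, and thresholds are hypothetical, not the actual production rules:

```python
# Minimal sketch of a column-level data quality rule check in the spirit of
# the monitoring described above. Columns and thresholds are illustrative.

def completeness(values):
    """Fraction of non-null, non-empty values in a column."""
    non_null = [v for v in values if v not in (None, "")]
    return len(non_null) / len(values) if values else 0.0

def evaluate_rules(rows, rules):
    """Apply {column: threshold} completeness rules; return score and pass/fail per column."""
    results = {}
    for column, threshold in rules.items():
        score = completeness([row.get(column) for row in rows])
        results[column] = {"score": round(score, 2), "passed": score >= threshold}
    return results

# Hypothetical member records with one missing id and one missing date of birth.
rows = [
    {"member_id": "M1", "dob": "1980-01-01"},
    {"member_id": "M2", "dob": None},
    {"member_id": "M3", "dob": "1975-06-30"},
    {"member_id": None, "dob": "1990-12-12"},
]
rules = {"member_id": 0.95, "dob": 0.70}
print(evaluate_rules(rows, rules))
```

A tool such as Information Analyzer applies the same idea at scale: each rule yields a score that is compared against an agreed threshold, and breaches are escalated for remediation.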
Senior IT Data Quality Analyst / Data Analyst / ETL Architect
Confidential
Responsibilities:
- Responsible for analyzing, designing, and developing strategies and technical solutions to improve data accuracy, reduce redundancy, and identify anomalies at the column, table, and cross-table level; understand the actual quality, content, and structure of the critical business data elements required by the business domain.
- Experienced in listing critical expectations, methods of measurement, and thresholds that business clients can associate with levels of success in their data governance activities; articulated specific achievements and milestones as success criteria, allowing managers to gauge individual accountability and reward achievement.
- Engage as a Subject Matter Expert (SME) on data quality improvements. Proactively promote consistency of data quality management goals, standards, policy, procedures, tools and techniques.
- Experienced working with large data sets; exceptional analytical, conceptual, and problem-solving abilities; able to profile and validate data sets from disparate data sources.
- Responsible for developing and managing the organization's Data Quality/Data Profiling Program using the IBM InfoSphere suite (Information Analyzer, InfoSphere Discovery, DataStage, QualityStage), Informatica Data Explorer, Informatica Data Quality, and SQL to evaluate complexity across business domains, perform enterprise-level data analysis, and discover hidden sensitive data for data governance.
- Authored and delivered presentations and a data quality scorecard, distributed through data quality analysis review sessions, conveying key accuracy and completeness measures for different data sources and remediation processes both upstream and downstream.
- Authored and developed a data quality analysis dashboard using Tableau that surfaces data anomalies and the health of data quality at the table and attribute level.
- Developed controlled data movement (ETL) between source and target using IBM InfoSphere DataStage, automated DataStage jobs using shell scripts, and developed reports at several levels to evaluate data quality across the enterprise.
- Advanced skills with spreadsheet, project management, and presentation software, including a strong working knowledge of MS Excel, MS PowerPoint, MS Project, and MS Visio.
Environment: IBM InfoSphere Suite; Information Analyzer 11.1x, Discovery Studio 4.5x, QualityStage, Informatica Data Explorer 9.5, Data Quality 9.5, Oracle, Oracle SQL Developer, Mainframe Copybook, XML, XSL, UNIX Shell Scripting, OmniPlus, Teradata, MS SQL Server, Tableau Desktop and Server.
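The column-level profiling that feeds a scorecard like the one described above can be illustrated with a small Python sketch. The column name and sample values are hypothetical:

```python
# Minimal sketch of column profiling of the kind that feeds a data quality
# scorecard: row count, null count, null percentage, and distinct count.
# The sample data is illustrative, not from any real source.

def profile_column(values):
    """Basic profile of a single column's values."""
    total = len(values)
    nulls = sum(1 for v in values if v in (None, ""))
    distinct = len({v for v in values if v not in (None, "")})
    return {
        "rows": total,
        "nulls": nulls,
        "null_pct": round(100.0 * nulls / total, 1) if total else 0.0,
        "distinct": distinct,
    }

states = ["NJ", "NY", "NJ", None, "PA", ""]
print(profile_column(states))
```

Per-attribute profiles like this, rolled up by source, are what a Tableau dashboard or scorecard then visualizes over time.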
Senior Programmer Analyst / Technical Lead Offshore / ETL Developer / QA Lead
Confidential
Responsibilities:
- Involved in all phases of building data warehouses and data marts, including data modeling (Star Schema and Snowflake dimensional models), analyzing business requirements, and developing ETL processes on both legacy and modern architectures.
- Extensively used Informatica Power Center Designer, Workflow Manager, and Monitor; designed and developed Informatica maps to populate the data mart and send notifications to downstream systems as XML messages using the web service transformer. Developed standard and reusable mappings and mapplets using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Router, Lookup, and Filter.
- Profiled the data to validate business understanding of the data and pointed out inconsistencies if any. Designed, created and implemented jobs to detect any data inconsistencies as In-Line and standalone processes. Designed and implemented Error Handling and Reject management closed loop functions.
- Used Cognos Impromptu extensively to create ad hoc reports to check measures and metrics.
- Designed mechanism to send alerts as soon as the jobs failed to PSO and the respective developers. Provided 24/7 support during the various test phases of Dev. to Dev., IST, END to END, and production support during the release Phase.
- Extensively worked on migrating Informatica maps from development to test and production environments. Involved in setting up war room sessions for the closure of issues.
- Designed a mechanism for identifying rejected or bad records during the loading process. Scheduled the maps using Workflow Manager, controlled by the Informatica engine, which also provides monitoring and performance statistics for each stage.
- Responsible for creating complete data archiving process and setting up jobs for data retrieval for NHTSA reporting. Created Stored Procedures to transform the data and worked extensively in PL/SQL for various needs of the transformations while loading the data.
- Performed Unit Testing, Integration Testing, regression testing and User Acceptance Testing (UAT) for every code change and enhancement.
- Worked on improving the performance of the designed jobs by using various performance tuning strategies.
- Wrote various UNIX (Korn) shell scripts and Perl scripts for scheduling the jobs.
Environment: IBM ES9000, COBOL-II, DB2, Stored Procedures, Triggers, IDMS, CICS, MVS/ESA, JCL, TSO, OS/390, z/OS, SYNCSORT, VSAM, IBM Debugger, SCLM change control, RACF, MQSeries, FILE-AID, QMF, SQL, SPUFI, Control-M, FTP, SFTP, UNIX, UDB V8, Oracle 10g, TOAD, XML, XSL, HTML, DHTML, Tomcat, Transbase 5, MS Office, Cognos Series 7.x (Impromptu Administrator, Impromptu User, Impromptu Web Reports, PowerPlay User, PowerPlay Web, ReportNet suite), Informatica Power Center V8x.
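The reject-handling pattern described above (routing bad records aside during a load rather than failing the whole job) can be sketched as follows. The record layout and field names are hypothetical:

```python
# Minimal sketch of reject management during a load: records failing a
# required-field validation are routed to a reject list with a reason,
# while valid records continue to the target. Field names are illustrative.

def load_with_rejects(records, required_fields):
    """Split records into (loaded, rejects) based on required-field checks."""
    loaded, rejects = [], []
    for rec in records:
        missing = [f for f in required_fields if not rec.get(f)]
        if missing:
            rejects.append({"record": rec, "reason": f"missing: {missing}"})
        else:
            loaded.append(rec)
    return loaded, rejects

records = [
    {"vin": "1HGCM82633A004352", "model_year": 2003},
    {"vin": "", "model_year": 2005},  # bad record: empty VIN
]
loaded, rejects = load_with_rejects(records, ["vin", "model_year"])
print(len(loaded), len(rejects))
```

In a tool like Informatica or DataStage the same closed loop is built with router/filter stages writing rejects to a remediation table, with alerts on the reject counts.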
Data Governance Analyst
Confidential
Responsibilities:
- Responsible for providing information services and for the discovery, development, and implementation of Citi's TTS (Treasury and Trade Solutions) Data Quality Management Chief Data Office (CDO) DQ Standards Operating Model (DQSOM), delivering a measurement point for critical data coverage under the TTS Scaled DQ Monitoring Program. TTS data is governed by enterprise standards and regulatory requirements calling for data to be monitored at its source, as early as possible, so that quality issues are surfaced and addressed early in the CCAR Critical Data Elements lifecycle.
- Work with Business Process Owners and the Data Governance User Group(s) to identify, classify and define each assigned Critical Data Element (CDEs) and ensure that each element has a clear and unambiguous definition.
- SME on enterprise Treasury and Trade Solutions data domains, CCAR and Target Record Layout attributes regional or business line organization structures and product categories.
- Collaborate with functional units on the identification and collection of business and technical metadata, data calculation, sourcing, and transformation rules, and establish data lineage for delivery mappings and data standards in a centralized data governance repository.
- Worked with Data Custodians, Business Process Owners, and IT on the execution and implementation of the Citi enterprise-wide Data Quality Standard Operating Model and Citi Data Management Policy, ensuring regulatory compliance and that operational reporting is fit for purpose, and socializing progress among stakeholders.
- Partner with IT to implement data standards, business rules and DQ issues and escalation procedures
- Initiate and oversee changes (i.e., creation, modification, deletion) to CDEs, recommend prioritization, and identify the impact on key stakeholders.
- Identified risk mitigation controls and collected evidence to support the attestation process.
- Experienced in gathering and documenting requirements for data-related projects such as data warehousing, business intelligence, and operational reporting.
- Conduct working groups to remediate data issues
- Produce metrics to support transparency into data delivery and controls in accordance with Data Management Policies and Standards
- Strong analytical and time management skills
- Excellent written and verbal communication skills and the ability to interact with a variety of customers and stakeholders
- Intermediate facilitation skills with the ability to drive issues to closure
- Self-motivated and able to handle tasks with minimal supervision or questions
- Ability to deliver a high level of customer service
Environment: MS SQL Server 2014, Compass, Rules Harmony, Enterprise DQP- AbInitio based platform