Data Analyst Resume Profile


PROFESSIONAL SUMMARY

  • Project Lead with 10+ years of experience working in the Information Technology industry.
  • Advocated data as a key IT asset and an integral part of the Software Development Life Cycle for over a decade across numerous Fortune 500 companies, government agencies, and international entities.
  • Experience in coordinating multi-discipline teams across functional organizations.
  • Proven expertise in data analysis, data warehousing, database management, business analysis, and technical assessment of large-scale data warehouse applications, including Data Transformation (DT), Data Extraction/Conversion, User Interface, Integration Testing, Optical Networking, Master Data Management (MDM), Common Reference Data (CRD), Data Governance, and Business Rules Engines.
  • Project lead for successful implementations and transparent application conversion/migration projects addressing standard and unique data administration and application development needs.

TECHNICAL SKILLS

  • Metadata/BI Tools: AllFusion Metadata Repository, MicroStrategy 9, Crystal Reports 2008
  • ETL Tools: DataStage 7.5, Informatica PowerCenter 8.6/8.1, Informatica Data Quality 8.6.1
  • Data Modeling Tools: Erwin 7.3, PowerDesigner
  • Languages: ISPF/TSO, JCL, CLIST, FTP, C, C++, Java, MS Office packages, Visio, SQL, PL/SQL, T-SQL
  • O/S: Windows 7, XP Professional, Windows Server 2000/2003/2008, UNIX, Sun Solaris 10, Linux
  • SQL Tools: TOAD, SQL Server Management Studio, SQL Navigator, Oracle SQL Developer, SQL*Plus
  • Version Control: Visual SourceSafe, Rational ClearCase
  • Software Design Principles: Agile Development, Waterfall Life Cycle
  • Databases: MS Access, DB2, UDB, Sybase 11, SQL Server 6.5/2000, Oracle 7.0/8.0/9i
  • Professional Technical Training:
  • Erwin Data Modeler and Erwin Model Navigator Training, Social Security Administration
  • Data Warehousing Training, UnitedHealthcare
  • Advanced SQL Development Training, Ramco Systems Limited
  • PUBLIC TRUST CLEARANCE - Social Security Administration

PROFESSIONAL EXPERIENCE

Confidential

Sr. Data Analyst/Data Mapper

Frontier acquired AT&T's wireline business for the state of Connecticut. As part of the acquisition, Frontier needed to migrate and convert relevant Customer Care and Billing (CC&B) data from AT&T's billing, customer care, trouble ticketing, service order, and inside plant application systems to the corresponding support systems within the Frontier application environment for residential, business, and wholesale customers. I was involved in this project in full capacity as a Sr. Data Analyst and Data Warehouse Analyst.

Responsibilities & Activities:

  • Performed data accuracy, data analysis, and data quality checks before and after loading the data.
  • Developed SQL joins, SQL queries, tuned SQL, views, test tables, and scripts in the development environment.
  • Used SQL*Loader to load data from external systems and developed PL/SQL APIs to move the data from staging tables into base tables.
  • Performed source-to-target application data mapping based on the business rules.
  • Performed data sanity and quantitative analysis on the target system.
  • Moved data from the target system to the Oracle ODS (Operational Data Store) and finally to Teradata, on which the enterprise data warehouse is based.
  • Designed and developed the dataflow architecture for building scorecards and statistical reports using the enterprise data warehouse.
  • Designed and developed BTEQ scripts to load data from the Operational Data Store (ODS) into the enterprise data warehouse (a minimal sketch follows this list).
  • Monitored ETL jobs through Informatica and BTEQ logs and was effectively involved in troubleshooting issues.
  • Created the test plan and test strategy to regression-test the data warehouse environment, and performed mock conversion testing to validate the business scenarios by pulling reports through tools like Cognos and Brio.
  • Prepared project plans and Project Management Office (PMO) reporting.
  • Prepared developer test cases to verify APIs written for the above-mentioned dataflow architecture.
  • Supported stress and volume testing where applicable to ensure target environments could support target volumes.
  • Coordinated shadow environment development and testing activities by working with the infrastructure team (Oracle and Teradata DBAs), data architects, business analysts, and subject matter experts.
  • Managed and supported hour-by-hour plans for all mock conversions and the actual production cutover.
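
For illustration, below is a minimal sketch of the kind of BTEQ load script described above, assuming hypothetical logon details and table names (ods_stage.customer_acct, edw.customer_acct); the project's actual objects and logic were specific to the Frontier environment.

    .LOGON tdprod/etl_user,password;   /* hypothetical TDPID and account */

    /* Incremental insert from the ODS staging table into the EDW target;
       only rows not already present in the target are copied. */
    INSERT INTO edw.customer_acct (acct_id, cust_name, svc_type, load_dt)
    SELECT s.acct_id, s.cust_name, s.svc_type, CURRENT_DATE
    FROM   ods_stage.customer_acct s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   edw.customer_acct t
                       WHERE  t.acct_id = s.acct_id);

    /* Exit nonzero on failure so the job scheduler can alert on the load. */
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .QUIT 0;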

Environment:

Oracle 11g, Teradata 14.0, Oracle SQL*Loader, Informatica 9.5.1, Informatica Data Profiler, Informatica Data Quality, Seapine TestTrack Client 2008.1, QTP 11, Data Warehouse, OLAP, SQL Navigator, SQL Developer, XML, OLTP

Confidential

Sr. Data Analyst/Data Mapper

  • Project 1: The PG-Quality Indicator (QI) Project is a clinical, outcomes-based research project that helps health care industry participants improve performance at their facilities. The QI Project's solutions are all housed in the web-based Data Center, a single vantage point for managing every aspect of the performance measurement and assessment process: from data collection on core or non-core measures, to data management, to transmission, to analysis. The products help clients comply with the complexities and understand the nuances of quality measure reporting requirements through sophisticated data collection solutions and analysis tools.
  • Project 2: ICD-9 to ICD-10 Conversion: The project migrated all PG-Quality Indicator product functionality and converted data from a mainframe-based system to an open systems environment, while simultaneously upgrading from ICD-9-CM (Clinical Modification) to ICD-10-CM/PCS (Clinical Modification/Procedure Coding System).
  • The PG-Quality Indicator Inpatient, Outpatient, and Behavior modules are currently being converted to the new ICD-10 medical diagnosis coding system. Any code requests that do not meet the criteria from the Joint Commission will be evaluated for implementation within ICD-10 on or after October 1, 2014.

Responsibilities & Activities:

  • Processed large Excel source data feeds for Global Function Allocations and loaded the CSV files into the DB2 database.
  • Validated that data in the development and production environments were in sync.
  • Documented functional specifications, conversions, upgrades, interfaces, reports, forms, and workflows.
  • Created the Hem Clinical Data dictionary for the various facilities.
  • Developed an understanding of the healthcare domain specific to core measure analysis and reporting.
  • Developed mapping documents for the ICD-9 to ICD-10 application.
  • Participated in sessions to map ICD-9 codes to ICD-10 codes for each conversion.
  • Through data analysis and mapping, created specifications to reflect how every field of information was to be converted.
  • Analyzed all ICD-10 codes and validated the outcomes.
  • Participated in and/or facilitated iterative mocks, internally and externally with clients, to validate data, then refined mapping requirements and ensured accuracy and quality.
  • Analyzed and validated technical requirements for HIPAA 5010 and gathered technical requirements for ICD-10.
  • Worked with an ICD-10 code translator tool to ensure that all ICD-9 codes were converted into ICD-10-CM codes.
  • Attended CMS teleconferences on the ICD-10 federal mandate for providers.
  • Created a gap analysis of the ICD-9 to ICD-10 conversion process and was also involved in data mapping for the conversion.
  • Gained an understanding of ICD-9 versus the new ICD-10 code sets; studied and analyzed the ICD conversion information provided by CMS (Centers for Medicare & Medicaid Services).
  • ICD-9 to ICD-10 conversion: Performed impact analysis to determine the systems impacted by the ICD-9 to ICD-10 conversion.
  • Reviewed process designs and stored procedures; ensured compliance with standards.
  • Involved in data quality work and used Informatica Data Profiler for profiling.
  • Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Was involved in testing OLAP cubes and multiple web-based applications across different dimensions, allowing users to analyze the core measures data.
  • Built tables and wrote SQL statements/queries, stored procedures, and views using company-defined best practices for security and efficiency.
  • Created relational data models with Erwin and set the enterprise modeling standards.
  • Developed the complete batch process module by writing complex DB2 stored procedures, functions, and triggers.
  • Developed the best practices and standards for data governance processes.
  • Created source-to-target mappings for multiple sources from SQL Server to DB2; these were used by the ETL developers.
  • Performed data accuracy, data analysis, and data quality checks before and after loading the data.
  • Developed SQL joins, SQL queries, tuned SQL, views, test tables, and scripts in the development environment.
  • Used SQL*Loader to load data from external systems and developed PL/SQL programs to move the data from staging tables into base tables.
  • Extensively wrote SQL queries with subqueries, correlated subqueries, and join conditions for data accuracy, data analysis, and data extraction needs (see the validation sketch after this list).
  • Developed the E-R diagrams for the logical database model and created the physical data model with Erwin Data Modeler.
  • Evaluated SQL processes/results and investigated problems.
  • Responsible for routine and/or scheduled manual database import/update processes from external sources to support comparative reports.
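
For illustration, a minimal sketch of the correlated-subquery validation described above, assuming hypothetical staging and crosswalk tables (stg_claim_dx, icd9_icd10_xwalk); the actual table and column names were project-specific.

    -- Find claim diagnosis codes with no entry in the ICD-9 to ICD-10
    -- crosswalk, so they can be reviewed before conversion (DB2 SQL).
    SELECT d.claim_id,
           d.icd9_cd
    FROM   stg_claim_dx d
    WHERE  NOT EXISTS (SELECT 1
                       FROM   icd9_icd10_xwalk x
                       WHERE  x.icd9_cd = d.icd9_cd);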

Environment:

DB2 on z/OS, Erwin Data Modeler r7.3, Erwin Model Navigator r7.3, Informatica Data Profiler, Informatica Data Quality, Seapine TestTrack Client 2008.1, QTP 11, Data Warehouse, OLAP, SQL Navigator, SQL Developer, Erwin 4.0, XML, OLTP

Confidential

Sr. Data Analyst/Data Mapper, Client: Social Security Administration

  • Project 1: The T2 TROR (Treasury Report on Receivables) project supports the Social Security Administration's accounting for SSA's debt portfolio according to Department of the Treasury (Treasury) and Office of Management and Budget (OMB) government-wide TROR reporting. The project supports improved debt management by accounting for and reporting SSA's progress in resolving overpayments, including collecting delinquent and written-off debt, based on Treasury and OMB reporting requirements, and by providing information to support OMB debt management metrics and improve debt collection, based on the results of existing collection methods. This effort also supports SSA's strategic goal to ensure superior stewardship of its programs and resources.
  • Project 2: The Recovery and Collection of Overpayments (RECOOP) system was developed in 1994 to facilitate the billing and collection of debts for Title XVI and Title II on the ROAR and SSR master files. The processing platform for RECOOP is CICS, and the database management system is IDMS. The RECOOP IDMS database currently contains records for more than 12 million debtors. The IDMS database technology is old, and support for this type of database is diminishing. This project aligns with SSA's strategic vision for agency-wide database management system upgrades, which includes moving to a DB2 database management system.
  • Project 3: T2 Modernization Remittance converts the RIC E and RIC T VSAM records into more efficient DB2 database tables to fulfill SSA's strategic vision for an agency-wide database management system, which includes converting systems to DB2 databases. The conversion will ensure the data meets Systems standards for efficiency and accuracy. The changes will also make processing more effective and efficient with no noticeable functionality changes to the customer.

Responsibilities & Activities:

  • Collaborated with peers in both business and technical areas to deliver optimal business process solutions in SharePoint.
  • Settled on the methodologies and procedures for carrying out effective data analysis.
  • Prepared the detailed TROR technical requirements documents and the technical data mapping of business process rules, and provided support for all phases of the System Development Life Cycle.
  • Worked with project sponsors, subject matter experts, data stewards, business analysts, and developers to capture the requirements that served as the foundation for the technical documentation.
  • Interacted with data modelers, LDBAs, data analysts, and users to discuss the table structures required in the application.
  • Mapped RECOOP legacy data fields to the DB2 database and cleaned the legacy data based on the business requirements.
  • Through data analysis and mapping, created specifications to reflect how every field of information was to be converted.
  • Performed manual data audits on the target system (see the reconciliation sketch after this list).
  • Identified data anomalies for identified data sources and reviewed the results with users and data stewards.
  • Performed data accuracy, data analysis, and data quality checks before and after loading the data.
  • Prepared custom SQL queries for testing the application per the technical specifications.
  • Created data set reports for users by writing SQL queries, which were then used for UAT.
  • Participated in and/or facilitated iterative mocks, internally and externally with clients, to validate data, then refined mapping requirements and ensured accuracy and quality.
  • Gathered the data necessary for analysis from a number of sources, accumulated it in a prearranged format, and entered it into several data analysis applications.
  • Participated in all phases of converting the RECOOP IDMS database to a SQL Server 2008 database.
  • Improved standardization of data.
  • Converted all the existing validated RECOOP data in the IDMS database so the data could be stored properly in the SQL Server database.
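
For illustration, a minimal sketch of the legacy-to-target reconciliation described above, assuming hypothetical staging and converted tables (stg_recoop_debtor, dbo.debtor); the RECOOP objects themselves are SSA-internal.

    -- Compare row counts between the legacy staging extract and the
    -- converted SQL Server target (T-SQL).
    SELECT (SELECT COUNT(*) FROM stg_recoop_debtor) AS legacy_rows,
           (SELECT COUNT(*) FROM dbo.debtor)        AS target_rows;

    -- List debtor keys present in the legacy extract but missing from
    -- the target, for follow-up with the data stewards.
    SELECT s.debtor_id
    FROM   stg_recoop_debtor s
    LEFT JOIN dbo.debtor t ON t.debtor_id = s.debtor_id
    WHERE  t.debtor_id IS NULL;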

Environment:

Erwin Data Modeler r7.3, Erwin Model Navigator r7.3, CA Unicenter Platinum tools, CA AllFusion Data Repository, COBOL, JCL, DB2 on z/OS, ENDEVOR, CONTROL-M, XPEDITER, utilities like SPUFI, SQL, HP ALM 11.0, QTP 10.0, QMF, SQL Server 2008, Sybase 10 on Sun, Oracle 11g

Confidential

Sr. Data Analyst/Data Mapper

Responsibilities & Activities:

  • This is a Jakarta Struts-based portal that helps users registered with the client obtain information about their health conditions and health care plans online. Since many of the queries that used to go to the client's front office can now be answered from the portal itself, the portal helps the client provide better service. The main ideas behind the portal are making claims information readily available to users, making information about health conditions readily available, and providing online information about the health care plans users have opted into. Online requests for ID cards, downloadable claim forms, and customized claim forms for employers are among the many services provided by myUHC to reduce the number of calls made to UHG for health plan related queries. Also worked on ICD-9 to ICD-10 conversion efforts.
  • Processed large Excel source data feeds for Global Function Allocations and loaded the CSV files into the Oracle database with the SQL*Loader utility (a control-file sketch follows this list).
  • Supported the ICD-10 transition within the evolving healthcare environment of accountable care.
  • Supported CMS ICD-10 conversion activities in preparation for implementation.
  • Affordable Care Act (ACA): Knowledge of conversion projects concerning protocols/standards in the significant areas of ICD-9/ICD-10, HIPAA 4010/5010 formats, and EDI (Electronic Data Interchange) codes.
  • ICD-9 to ICD-10 conversion: Performed impact analysis to determine the systems impacted by the ICD-9 to ICD-10 conversion.
  • Participated in and/or facilitated iterative mocks, internally and externally with clients, to validate data, then refined mapping requirements and ensured accuracy and quality.
  • Performed code inspections and moved the code into the production release.
  • Developed the best practices and standards for data governance processes.
  • Performed data filtering and dissemination activities, troubleshot database activities, diagnosed bugs, and logged them in the version control tool.
  • Created source-to-target mappings for multiple sources from SQL Server to Oracle; these were used by the ETL developers.
  • Performed data accuracy, data analysis, and data quality checks before and after loading the data.
  • Through data analysis and mapping, created specifications to reflect how every field of information was to be converted.
  • Involved in data quality work and used Informatica Data Profiler for profiling.
  • Coordinated with the business analyst team on requirements gathering and the allocation process methodology, and designed the filters for processing the data.
  • Designed and developed the database objects (tables, materialized views, stored procedures, indexes) and SQL statements for executing the allocation methodology and creating the OP table and CSV/text files for the business.
  • Performed the physical database design, normalized the tables, and worked with denormalized tables to load the data into the fact tables of the data warehouse.
  • Used SQL*Loader to load data from external systems and developed PL/SQL programs to move the data from staging tables into base tables.
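
For illustration, a minimal sketch of a SQL*Loader control file for the CSV loads described above; the file, table, and column names (alloc_feed.csv, stg_alloc) are hypothetical placeholders. It would be run with the sqlldr command line utility, after which PL/SQL programs move the staged rows into the base tables.

    -- Hypothetical SQL*Loader control file: append rows from a CSV feed
    -- into an Oracle staging table.
    LOAD DATA
    INFILE 'alloc_feed.csv'
    APPEND
    INTO TABLE stg_alloc
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
      func_cd,
      alloc_amt,
      alloc_dt DATE 'YYYY-MM-DD'
    )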

Environment:

DB2 on z/OS, Data Warehouse, OLAP, UNIX sh shell, SQL Navigator, SQL Developer, Erwin 4.0, Informatica Data Profiler, Informatica Data Quality, OLTP, MS Excel 2000, MS Office 2000, Microsoft Windows XP Professional

Confidential

Sr. Data Analyst, Client: Social Security Administration

Supported the Enterprise Metadata Repository and the Global Reference Table Portal for z/OS, a powerful data management tool that enables organizations to identify, understand, coordinate, and effectively use SSA (Social Security Administration) agency-wide information assets. Based on an open, non-proprietary architecture and popular relational databases, the repository supports business and technical metadata for data warehousing, data administration, and application development efforts.

Responsibilities & Activities:

  • Participated in all phases of the programmatic data repository development life cycle, with emphasis on design, development/programming, documentation, testing, and implementation.
  • Identified data anomalies for identified data sources and reviewed the results with users and data stewards.
  • Created source-to-target mappings with transformation rules.
  • Wrote SQL queries for data accuracy, data analysis, and data extraction needs.
  • Created requirements for data modelers, such as minimum and maximum values, acceptable ranges, and null acceptance (see the profiling sketch after this list).
  • Designed and prepared the logical/physical data models for the Global Reference Table Portal using CA Erwin Data Modeler per the special needs of the SSA.
  • Provided support for automated and manual procedures for metadata creation and retrieval from the enterprise data repository, delivered to the application teams.
  • Designed and prepared the physical data models using CA Erwin Data Modeler to support the relational database systems, and customized the repository meta models per the special needs of the SSA.
  • Provided support to streamline application development, data dictionaries, data maps, and data artifacts, and assisted with coordinating the data standards across the development teams.
  • Designed and developed the backup/recovery strategy for the metadata repository.
  • Responsible for ETL design: identifying the source systems, designing source-to-target relationships, data cleansing, data quality, creating source specifications, and ETL design documents; ETL development followed Velocity best practices.
  • Analyzed the use cases, developed the test cases, and helped prepare the master test plan.
  • Analyzed the current data movement (ETL) processes and procedures; identified and assessed external data sources as well as internal and external data interfaces.
  • Provided support for development of data requirements for the Medicare and disability applications and worked with the latest releases of the enclosed programmatic systems: Medicare Part A/B and IRMAA (PT2T18).
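
For illustration, a minimal sketch of the column profiling that feeds such modeler requirements (minimum, maximum, acceptable range, null acceptance), against a hypothetical reference table ref_country; the actual repository tables are SSA-specific.

    -- Profile a column to derive modeling requirements: minimum, maximum,
    -- distinct values, and null counts (DB2 SQL).
    SELECT MIN(country_cd)                      AS min_val,
           MAX(country_cd)                      AS max_val,
           COUNT(DISTINCT country_cd)           AS distinct_vals,
           SUM(CASE WHEN country_cd IS NULL
                    THEN 1 ELSE 0 END)          AS null_rows,
           COUNT(*)                             AS total_rows
    FROM   ref_country;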

Environment:

Erwin Data Modeler r7.3, Erwin Model Navigator r7.3, CA Unicenter Platinum tools, UNIX sh shell, CA AllFusion Metadata Repository, COBOL, JCL, DB2 on z/OS, ENDEVOR, CONTROL-M, utilities like SPUFI, SQL, QMF, HP ALM 11.0, QTP 10.0, SQL Server 2008, Sybase 11, Oracle 11g, Windows 7/XP/2003

Confidential

Sr. Data Analyst

Supported FTLOS (Fifth Third Loan Origination System), a web-based application used by the banking and mortgage service groups of Fifth Third Bank.

Responsibilities & Activities:

  • Participated in project reviews and consulted on project designs.
  • Participated in data architect sessions to define the table structures required in the application.
  • Designed and created tables and triggers in DB2 (see the trigger sketch after this list).
  • Fine-tuned the system by creating relevant indexes and putting proper checks in the existing front-end applications.
  • Wrote code for the core batch processes in the Commercial Loan Origination System.
  • Developed external interfaces that integrate the FTLOS system with third-party systems.
  • Supported team members in writing complex stored procedures and with other troubleshooting during the development process.
  • Monitored database performance metrics such as buffer pool hit ratio, package hit ratio, and catalog hit ratios.
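
For illustration, a minimal sketch of the kind of DB2 trigger described above, assuming a hypothetical loan table and audit log (loan, loan_audit); these are placeholders, not actual FTLOS objects.

    -- Audit trigger: record every loan status change with a timestamp
    -- (DB2 SQL).
    CREATE TRIGGER trg_loan_status_audit
      AFTER UPDATE OF status ON loan
      REFERENCING OLD AS o NEW AS n
      FOR EACH ROW MODE DB2SQL
      INSERT INTO loan_audit (loan_id, old_status, new_status, changed_ts)
      VALUES (n.loan_id, o.status, n.status, CURRENT TIMESTAMP);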

Environment:

Erwin Data Modeler r7.0, Erwin Model Navigator r7.0, DB2 UDB 9.1, UNIX sh shell, CA Data Repository, Crystal Reports 9.0, HTML, JavaScript, IIS 6.0

Confidential

Data Analyst

This is a versatile e-governance system built for managing customers, revenue, resources, readings, etc., across various eThekwini departments such as Water, Electricity, and DSW. The system additionally has the flexibility to let the user configure/define departments, services, etc. It takes care of the complete cycle, from a customer requesting a service to the charges raised against that service; the charge raised becomes an element of the monthly bill generated for the particular customer. The system also generates a single integrated bill for every customer against their utilization of the council's Water, Electricity, DSW, and other facilities, on a monthly basis.

Responsibilities & Activities:

  • Understood the functionality of the core business processes.
  • Implemented coding, procedural, and change control guidelines for application development in the proprietary environments.
  • Involved in data quality and profiling from source systems.
  • Responsible for ETL design: identifying the source systems, designing source-to-target relationships, data cleansing, data quality, creating source specifications, and ETL design documents; ETL development followed Velocity best practices.
  • Involved in data Extraction, Transformation, and Loading (ETL) from source systems.
  • Designed a reporting table approach to get better performance from complex queries and views.
  • Cleansed the data received from the legacy customer information systems and transformed it into staging tables and target tables in DB2.
  • Used external tables to transform and load data from the legacy systems into the target tables.
  • Performed incremental loading of the fact table from the source system to the staging table on a daily basis (see the sketch after this list).
  • Designed and implemented the proof of concept in the development environment.
  • Expertise in SQL tuning, DB/DBM parameter tuning, and different levels of memory allocation; experience with tools such as export, import, load, and autoloader, and with all kinds of database snapshots.
  • Attended to day-to-day user queries and sorted out database problems.
  • Coordinated with user departments and attended to their requirements for smooth functioning.
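
For illustration, a minimal sketch of the daily incremental fact load described above, assuming hypothetical staging and fact tables (stg_billing_txn, fact_billing); the real eThekwini billing objects differed.

    -- Daily incremental load (DB2 SQL): copy only staging rows newer than
    -- the most recent transaction date already in the fact table.
    INSERT INTO fact_billing (txn_id, cust_id, svc_cd, amount, txn_dt)
    SELECT s.txn_id, s.cust_id, s.svc_cd, s.amount, s.txn_dt
    FROM   stg_billing_txn s
    WHERE  s.txn_dt > (SELECT COALESCE(MAX(txn_dt), DATE('1900-01-01'))
                       FROM   fact_billing);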

Environment:

DB2 8.1, J2EE, Linux, IBM WebSphere 6.0, Erwin, SQL Server 2000/2005, Crystal Reports 9.0, HTML, JavaScript, IIS 6.0, DataStage 7.0

Confidential

SQL Programmer

Oregon Research Institute is implementing a comprehensive and integrated Human Resources Management System (HRMS), which includes payroll, self-service, and learning management software covering all departments, along with finance and general ledger. The proposed system provides a single, integrated view of employee information across all departments at Oregon Research Institute.

Responsibilities & Activities:

  • Provided SQL support for the customized web-based HRMS solution and the finance and general ledger components.
  • Involved in writing back-end stored procedures.
  • Provided PowerDesigner data model reports for the application teams.
  • Used utilities to back up and recover user data and the DB2 catalog.
  • Created databases and other database objects (tables, table spaces, indexes, and storage groups).
  • Worked with the export, import, and load utilities to manage table data.
  • Prepared and implemented standard recovery procedures for data recovery, including recovering data to a prior point in time.
  • Used RUNSTATS, REORGCHK, and REORG to enhance application performance (see the maintenance sketch after this list).
  • Used the DB2 access control mechanism to implement security within the database.
  • Involved in installing DB2 on AIX, creating instances and databases (including DBM configuration) on different servers, and providing connectivity to the development, test, staging, and production databases.
  • Involved in developing database application objects such as stored procedures, functions, triggers, and user-defined data types.
  • Interacted with payroll IT staff to gather and analyze requirements.
  • Involved in developing Pay Bill, Pay Slip, Payment Register, Service Register, Vacancy List, Retirement Due List, Employee Salary History, History of Leave Adjustment, Form 16, Form 24, Form 12BA, and other reports.
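
For illustration, a minimal sketch of the routine DB2 maintenance cycle referenced above, against a hypothetical payroll table (hr.pay_detail); the exact options varied by table and environment.

    -- Refresh optimizer statistics, check whether reorganization is needed,
    -- then reorganize the table and its indexes (DB2 command line processor).
    RUNSTATS ON TABLE hr.pay_detail WITH DISTRIBUTION AND DETAILED INDEXES ALL;
    REORGCHK CURRENT STATISTICS ON TABLE hr.pay_detail;
    REORG TABLE hr.pay_detail;
    REORG INDEXES ALL FOR TABLE hr.pay_detail;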

Environment:

SQL Server 2000, DB2 UDB 8.1, PowerDesigner, HTML, IIS 6.0, Visual Studio .NET, VSS, JavaScript, Crystal Reports 8.5, Platinum Erwin 3.5.2, Windows 2003, and SUSE Linux
