Ab Initio Lead / Principal Data Analyst Resume
Northbrook, IL
SUMMARY:
- Nearly 9 years of systems development and ETL experience, specializing in large data warehouse systems.
- Over 7 years of experience in Ab Initio, along with 7 years of experience in RDBMSs such as Teradata, DB2, and Oracle, and in UNIX shell scripting.
- Over 3 years of experience in data analysis.
- Nearly 1 year of experience with metadata management and AWS.
- Sound knowledge of database architecture for OLTP and OLAP applications, data analysis, and ETL methodologies using SSIS and Ab Initio to develop data marts and an enterprise data warehouse.
- A proactive approach to problem solving, taking ownership of issues & having the determination to follow things through.
- Excellent knowledge of data analysis, data validation, data cleansing, data verification, and identifying data mismatches.
- Hands-on experience creating high-level design, low-level design, data mapping, use-case mapping, unit test case, and production implementation plan documents.
- Experience with data load/unload utilities such as BTEQ (Import/Export), FastLoad, MultiLoad, TPump, and FastExport, in both API and UTILITY modes.
- Extensive experience working in a mixed onsite-offshore global delivery model.
- Good knowledge of multiple scheduling tools: Control-M, Maestro, AutoSys, and Operational Console.
- Responsible for a mission-critical, high-availability (near 24x7) batch gateway using Ab Initio, Oracle, UNIX ksh scripting, Tivoli TWS/Maestro scheduling, and SQL.
- Extensive experience customizing Metadata Hub portal home page topics, object views, tabular views, reports, and entities. Created various imports to load metadata from multiple systems of record (SORs) into Metadata Hub.
- Strong PL/SQL and SQL programming skills across the database development life cycle.
TECHNICAL SKILLS:
Methodologies: Ralph Kimball, Object-Oriented Analysis and Design, Star & Snowflake modeling, Agile
ETL Tools: Ab Initio GDE 1.13/Co>OS 2.13, GDE 1.15/Co>OS 2.15, GDE 3.1.1/Co>OS 3.1.4.5, GDE 3.1.7/Co>OS 3.1.6.3, GDE 3.2.2/Co>OS 3.2.5.1, Informatica PowerCenter 6.x, Informatica PowerMart 5.x.
BI Tools: MicroStrategy, Business Objects 6.0/5.x (Designer, Supervisor, WebI 6.1), Cognos.
Database & Tools: Oracle 10g/9i/8i/7.x, Teradata V2R12/V2R13, MySQL 5.7, SQL Server 2008, Erwin 9.64
Application Servers: Oracle Application Server, BEA WebLogic 7.0, IBM WebSphere, IIS.
GUI: Visual Basic 5.0/6.0, Developer 2000
Software/Tools: Java, J2EE (JSP, Servlets, EJB), Java RMI, JDBC, JavaScript, VBScript, C, C++, SQL, PL/SQL, ASP 2.0, HTML, XML
Operating Systems: Sun Solaris 8.x, HP-UX 11i, Linux, Windows XP/2000/NT, AWS
PROFESSIONAL EXPERIENCE:
Confidential, Northbrook, IL
Ab Initio Lead / Principal Data Analyst
Responsibilities:
- Working closely with subject matter experts (SMEs) on the analysis and design of functional documents.
- Coordinating with source system owners.
- Involved in the full project life cycle: design, data modeling, and requirements gathering.
- Hands-on experience with Agile methodologies.
- Collaborate with data architects for data model management and version control.
- Worked on snowflake-schema data modeling for the data mart based on inputs from the architect and business analyst.
- Conduct data model reviews with project team members.
- Capture technical metadata through data modeling tools.
- Ensure data warehouse and data mart designs efficiently support BI and end-user needs.
- Worked with AWS Data services including S3, Data Pipeline, RDS, EC2, and others.
- Produced the required documents like High Level Design, Mapping Documents and Use Case Documents.
- Involved in design, development, testing, and implementation using Ab Initio ETL.
- Created generic graphs to read multiple source Data from Hadoop Data Lake.
- Providing oversight and technical guidance for developers on Ab Initio.
- Leading the team with possible solutions in critical situations.
- Developing UNIX shell scripts to automate repetitive database processes and maintaining shell scripts for data conversion.
- Involving in Project requirement discussions and interacting with other teams in support and release management activities.
- Used SQL Developer for data profiling and analyzing the data for data quality check.
Environment: Ab Initio (GDE 3.2.2, Co>OS 3.2.5.1), Erwin R 9.64, SQL Developer, Oracle 10G, Linux, AWS S3, AWS Aurora, AWS EC2, Control-M
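The data-quality and shell-automation work in this role can be sketched as a small reconciliation check. This is only an illustrative sketch: the file names, record counts, and the idea of faking the target count with a local file are all hypothetical stand-ins for a real source extract and a `SELECT COUNT(*)` result.

```shell
#!/bin/sh
# Hypothetical row-count reconciliation between a source extract and the
# count loaded into the target table (both simulated with local files here).
src_file=/tmp/source_extract.dat
tgt_count_file=/tmp/target_count.txt

printf 'rec1\nrec2\nrec3\n' > "$src_file"   # stand-in for a source extract
echo 3 > "$tgt_count_file"                  # stand-in for SELECT COUNT(*) output

src_count=$(wc -l < "$src_file")
tgt_count=$(cat "$tgt_count_file")

if [ "$src_count" -eq "$tgt_count" ]; then
    status="RECONCILED"
else
    status="MISMATCH"
fi
echo "$status: source=$src_count target=$tgt_count"
```

In practice the same pattern gates a load job: a mismatch would exit non-zero so the scheduler flags the run for investigation.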
Confidential, Atlanta, Georgia
Principal Data Analyst
Responsibilities:
- Reviewing data lineage and data controls end to end for 15 CCAR submissions to the Fed and documenting business processes for data flows and controls.
- Coordinating with source system owners.
- Reverse engineered the CCAR Submissions technical mappings to Source systems.
- Converted the technical mapping documents to Business definitions of all CCAR reports.
- Converted technical mappings to Enterprise level Metadata hub.
- Used data profile techniques to identify missing Business rules.
- Working as onsite coordinator, guiding the offshore team to complete assigned tasks.
Environment: Ab Initio (GDE 3.2.2, Co>OS 3.2.5.1), Linux
Confidential, Durham, CA
Ab Initio Lead / Principal Data Analyst
Responsibilities:
- Coordinating with source system owners, day-to-day ETL progress monitoring, Data warehouse target schema Design (Star Schema) and maintenance.
- Responsible for translating technical design documents into detail design documents using skill sets like Ab Initio graph detail design/development/unit testing.
- Hands-on experience with master data management (MDM) to build an enterprise-level database.
- Designed Ab initio mappings by translating the business requirements.
- Led the team with possible solutions in critical situations.
- Provided oversight and technical guidance for developers on Ab Initio.
- Worked with the PMO team to provide regular updates and adjust timelines as necessary.
- Developed Tasks to integrate MDM into enterprise data.
- Used MDM processes to create customer IDs across the enterprise.
- Experience working with Ab Initio Metadata Hub; administered data store upgrades and maintenance of the portal.
- Developed mappings for customers, Investments and Risk analysis.
- Developed reusable Transformations.
- Widely used Partition components (chosen based on data volume), Filter by Expression, Sort, Reformat, Gather, Merge, Redefine, Replicate, Scan, Rollup, Join, Dedup Sorted, Normalize, and all types of dataset components to develop the ETL transformation logic.
- Designed and configured Ab Initio web services to automate eID requests using a web services consumer transformation.
- Assisted in building physical and conceptual data models using Erwin 9.0.
- Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems.
- Resolved inconsistent and duplicate data to support strategic goals with multi-domain MDM.
- Assisted the team in the development of design standards and codes for effective ETL procedure development and implementation.
- Created Conduct>It plans to combine multiple graphs.
- Designed job flows using the Control-M scheduling tool to meet all SLA requirements.
- Worked on loading data from several flat-file sources to the staging area using Teradata MultiLoad, FastLoad, and FastExport.
- Involved in the design, development, and testing of BTEQ scripts for the ETL processes.
- Created summarized tables, control tables, staging tables to improve the system performance and as a source for immediate recovery of Teradata database.
- Involved in extensive data validation, writing several complex SQL queries; performed back-end testing and worked on data quality issues.
- Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
- Worked as onsite coordinator, guiding the offshore team to complete assigned tasks.
Environment: Ab Initio (GDE 3.1.7, Co>OS 3.1.6.3), CONDUCT>IT, Teradata, AIX, KSH Scripting, Control-M.
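The BTEQ and ksh-scripting bullets above can be illustrated with a minimal wrapper that generates a BTEQ control script for a staging load. Everything here is a hypothetical sketch: the table, server, user, and file names are invented, and the script is only generated, not executed, since running it would need a Teradata client.

```shell
#!/bin/sh
# Sketch: generate a BTEQ script that truncates and reloads a staging table
# from a pipe-delimited file. STG_DB.CUST_STG, tdprod, and etl_user are
# hypothetical names; in production bteq would then consume this script.
STG_TABLE=STG_DB.CUST_STG
SRC_FILE=/data/in/cust.dat
BTEQ_SCRIPT=/tmp/load_cust.bteq

cat > "$BTEQ_SCRIPT" <<EOF
.LOGON tdprod/etl_user,\$TD_PASSWORD;
DELETE FROM $STG_TABLE;
.IMPORT VARTEXT '|' FILE=$SRC_FILE;
.REPEAT *
USING (cust_id VARCHAR(18), cust_nm VARCHAR(60))
INSERT INTO $STG_TABLE (cust_id, cust_nm)
VALUES (:cust_id, :cust_nm);
.LOGOFF;
.QUIT;
EOF

# A production wrapper would follow with: bteq < "$BTEQ_SCRIPT"
echo "wrote $BTEQ_SCRIPT"
```

Generating the control script from shell variables is what lets one wrapper serve many staging tables, with the scheduler passing the table and file names per job.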
Confidential, Columbus, OH
Sr. Ab Initio Developer/ Principal Data Analyst
Responsibilities:
- Involved in Gathering the requirements for technical design, mappings and development.
- Involved in data analysis to identify the end to end flow issues.
- Closely worked with the SME’s for the Analysis, Design on Functional documents.
- Extensively used Partition by Key, Partition by Expression/Round Robin, Filter by Expression, Sort, Reformat, Gather, Merge, Redefine, Replicate, Scan, Rollup, Join, Dedup Sorted, Normalize, Leading Records, and all types of dataset components to develop the ETL transformation logic.
- Used Call Web service component to interact with the Services.
- Used CURL application to upload data into Java services.
- Hands-on experience processing mainframe (MVS) files.
- Created the automated jobs using Operational console.
- Created the reports on the mainframe for source data validations.
- Used DB2 load utilities to load the data from different sources like MVS, UNIX files.
- Created the validation graphs to validate the source data.
- Created generic graphs to generate external extracts for different users.
- Created graph-level and project-level parameters (psets) according to the requirements.
- Used Web service components to interact with the Java services.
- Created Generic Graphs to interact with multiple methods of Java Services.
- Worked on generating psets with the generic extracts.
- Created 90+ psets using a generic process to support the different requirements.
- Created different DBC files depending on the environment.
- Created the Common Project parameters to use the parameters at the enterprise level.
- Involved in writing the SQL Statements to validate the data in the Tables as part of Unit, System and Integration testing.
- Created Reports for AML Process, which is used for audit on compliance.
- Created multiple XFRs to support the generic extract.
- Used PDL and Shell interpretations to resolve the graph level parameters.
- Created Conduct>It plans to combine multiple graphs and support Operational Console jobs.
- Created test cases and data validation for Unit, System and Integration testing.
- Deeply involved in supporting QA/UAT testing.
- Provided production support for the initial release of the project: monitored jobs and took corrective action based on job status.
Environment: Ab Initio (GDE 3.1.1, Co>OS 3.1.4.5), CONDUCT>IT, Ops-Console, MVS, Linux, KSH Scripting, Distributed DB2, Oracle.
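The 90+ psets above were stamped out from a generic graph rather than written by hand; a simplified generation loop of that kind is sketched below. The extract names, paths, and the flattened name|value file layout are illustrative assumptions only, not the exact GDE pset serialization.

```shell
#!/bin/sh
# Sketch: emit one pset per extract, each pointing at a generic prototype
# graph and overriding its parameters. All names/paths are hypothetical,
# and the name|value layout is a simplification of the real pset format.
PSET_DIR=/tmp/psets
rm -rf "$PSET_DIR"
mkdir -p "$PSET_DIR"

for extract in aml_audit cust_daily risk_summary; do
    cat > "$PSET_DIR/${extract}.pset" <<EOF
prototype|/projects/edw/mp/generic_extract.mp
EXTRACT_NAME|$extract
OUT_FILE|/data/out/${extract}.dat
EOF
done

pset_count=$(ls "$PSET_DIR" | wc -l)
echo "generated $pset_count psets"
```

Driving the loop from a simple list keeps adding a new extract down to one line, which is how a single generic graph can support dozens of requirements.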
Confidential, GA
Ab Initio Developer
Responsibilities:
- Involved in different modules of enterprise data warehousing, supply chain, and .com projects.
- Acted as a business analyst, interacting with product and technical managers of the source systems to derive the business transformation logic.
- Involved in Design, Development, Testing, Documentation and Implementation of the project.
- Involved in the full life cycle of various projects: design, data modeling, requirements gathering, unit/QA testing, and production.
- Onsite coordinator for offshore team communication.
- Led the team to complete the assigned modules.
- Designed and developed mappings, transformation logic, and processes in Ab Initio to implement business rules and standardize source data from multiple systems into the data warehouse.
- Designed and created complex graphs using Ab Initio.
- Extensively used Partition by Key & Sort, Partition by Expression/Round Robin, Filter by Expression, Sort, Reformat, Gather, Merge, Redefine, Replicate, Scan, Rollup, Join, Denormalize, Dedup Sorted, Normalize, and all types of dataset components to develop the ETL transformation logic.
- Designed parallel, partitioned Ab Initio graphs using GDE components for a high-volume data warehouse.
- Worked with continuous components and XML components.
- Tested graphs for performance. Performance improvement measures included use of the multifile system, lookups, max-core tuning, in-memory joins, and parallelism techniques for maximum use of cache memory.
- Wrote and modified several application-specific config scripts in UNIX to pass environment variables.
- Generated SQL queries for data verification and back-end testing.
- Involved in writing several Teradata BTEQ scripts to implement the business logic.
- Hands-on experience with Teradata SQL Assistant to interface with Teradata.
- Wrote UNIX wrapper scripts for Ab Initio graphs.
- Involved with Unit testing, Functional Testing teams & Performance Testing teams.
- Designed/Developed scripts to move data from the Staging tables to the Target tables.
- Analyzed source and target record formats and made necessary changes.
- Investigated the cause for the load rejects and resolved them.
- Worked on production support in monitoring the jobs and took corrective action based on the job status.
- Worked extensively in a UNIX environment, creating Korn shell scripts to support Ab Initio graphs.
Environment: Ab Initio (GDE 1.15.1, Co>OS 2.15.115), Plan>It, UNIX, KSH Scripting, DB2 9.5, Red Hat Linux, Maestro Scheduler, Teradata V2R13.1.
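The Korn shell wrapper scripts noted in this role typically run a deployed graph and translate its exit code for the scheduler. Below is a minimal sketch with the graph invocation stubbed out, since the deployed .ksh only exists on an ETL server; the graph name and paths are hypothetical.

```shell
#!/bin/sh
# Sketch of a wrapper around a deployed Ab Initio graph. run_graph is a stub
# standing in for a deployed script such as /abinitio/deploy/load_orders.ksh.
GRAPH=load_orders
LOG=/tmp/${GRAPH}.log

run_graph() {
    # Stub: a real wrapper would invoke the deployed .ksh here and let its
    # exit code propagate back to this if/else.
    echo "graph $1 finished"
    return 0
}

if run_graph "$GRAPH" > "$LOG" 2>&1; then
    STATUS=SUCCESS          # scheduler (Maestro/Control-M) sees exit 0
else
    STATUS=FAILURE          # non-zero exit triggers the scheduler's alerting
fi
echo "$GRAPH completed with status $STATUS"
```

Keeping the log capture and exit-status translation in one wrapper means every graph presents the same success/failure contract to the scheduler.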
Confidential, VA
SQL Server Developer
Responsibilities:
- Designed ETL (Extract, Transform, Load) strategy to transfer data from source to stage and stage to target tables in the data warehouse tables and OLAP database from heterogeneous databases using SSIS and DTS (Data Transformation Service).
- Performed the ongoing delivery, migrating client mini-data warehouses or functional data-marts from different environments to MS SQL server.
- Deployed SSIS packages and SSRS reports to different environments.
- Developed complex SQL queries to verify data from source systems against target systems.
- Developed user test cases in coordination with business analysts and business users.
- Created SSIS packages to capture the status (success/failure) of the daily maintenance plan scheduled jobs, with a daily status report.
Environment: SQL Server 2000/2005 Enterprise, SSIS, SSRS, SSAS, Windows Enterprise Server 2003, Microsoft Visual Studio, MS Access, CSV, MS Excel, Load Runner, TSQL.
Confidential, GA
Ab Initio Developer
Responsibilities:
- Created graph-level and project-level parameters according to the requirements.
- Generated graphs using Ab Initio components such as Reformat, Join, Redefine Format, Sort, Denormalize, Input Table, Output Table, Update Table, Gather Logs, Run SQL, Write XML, Filter by Expression, Write Multiple Files, and custom components (FDQ component).
- Created continuous flows to create purchase orders and load them into DB2 tables.
- Extensively used continuous flow components.
- Wrote complex SQL queries, including inline queries and subqueries, for faster data retrieval from multiple tables.
- Developed a number of Ab Initio graphs based on business requirements using various components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Broadcast, Merge, etc.
- Worked with continuous flow components and XML components.
- Hands-on experience migrating code to QA and production.
- Interacted with all teams like Business Analyst, Development Team, Testing Team, Configuration Management Team and Project Management.
Environment: Ab Initio (GDE 1.15.1, Co>OS 2.15.115), UNIX, KSH Scripting, DB2 9.5, AIX version 5.