
Sr. ETL Developer Resume Profile


San Antonio, TX

PROFESSIONAL SUMMARY

  • More than 9.5 years of IT experience in the design, development, testing, and maintenance of Business Intelligence and database applications, using tools and technologies such as Informatica Power Center, Informatica Metadata Manager, SQL, and mainframe systems, with exposure to the healthcare domain.
  • Expertise in Financial Services, Insurance and Brokerage Domains.
  • Experience in the development and implementation of database, data warehousing, client/server, and legacy applications using data extraction, data transformation, data loading, and data analysis.
  • Strong skills in Data Analysis, Data Requirement Analysis and Data Mapping for ETL processes.
  • Experience using Informatica Cloud Services, Metadata Manager, Power Exchange, Salesforce and SAP connectors, and Reporting Services.
  • Extensive experience in Informatica Power Center to implement data marts which involve creating, debugging and executing mappings, sessions, tasks, and workflows and testing SQL queries.
  • Actively involved in all stages of SDLC including requirements analysis, estimation, high level design, low level design, coding/testing/review, integration testing, installations, scheduling and production support.
  • Experience in integration of various data sources like Oracle, DB2, SQL Server, Flat Files and Mainframe into Warehouse.
  • Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER-Studio.
  • General understanding of Database Management System and Data Warehouse including their functional and technical architecture and the design of Data Flow Diagram.
  • Well versed in data modeling concepts, incorporating dimensional modeling (star and snowflake schemas) and logical and physical data modeling.
  • Implementation and support of various web-based applications in OLTP, Data Warehousing, and OLAP environments.
  • Extensively worked with Slowly Changing Dimensions Type I, Type II, and Type III.
  • Proficient in performance tuning of Informatica Mappings, Transformations, and Sessions.
  • Exposure to Teradata FastLoad, MultiLoad, TPump, and TPT utilities.
  • Worked extensively with Informatica Workflow Manager, using tools such as the Task Developer, Worklet and Workflow Designer, and Workflow Monitor to build and run workflows.
  • Developed UNIX shell scripts to FTP source files, validate source files, and automate archival of log files, working through PuTTY.
  • Experience using Control-M 7.0 for scheduling workflows.
  • Extensive knowledge and hands on experience in z/OS, OS/390, COBOL, JCL, VSAM, DB2, MQ Series and CICS.
  • Well versed with tools/utilities such as File-Aid, Abend-Aid, InterTest, DFSORT/ICETOOL, ChangeMan, TSO/ISPF, IDCAMS, SPUFI, QMS, SQL, and other IBM utilities.
  • Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
  • Experienced in scheduling Sequence and Parallel jobs using DataStage Director and UNIX scripts.
  • Excellent communication and strong interpersonal skills with ability to interact with end-users, managers and technical personnel.
  • Experience in leading teams. Roles included Project Lead, System Analyst, Onsite Coordinator and Developer.
  • Extensive experience in Unit Testing, Functional Testing, System Testing, Integration Testing, Regression Testing, and User Acceptance Testing (UAT).
  • Expertise in technical documentation: induction manuals, user guides, control-flow diagrams, traceability matrices, and weekly/monthly status reports.
  • Worked with both Waterfall and Agile methodologies.

WORK EXPERIENCE

Confidential

Sr. ETL Developer

  • United Services Automobile Association (USAA) is a Fortune 500 financial services company offering banking, investing, and insurance to people and families that serve, or have served, in the United States military.
  • USAA was founded in 1922 by a group of U.S. Army officers to self-insure each other when they were unable to secure auto insurance due to the perception that they were a high-risk group. USAA has since expanded to serve all members of the Armed Forces and all who served honorably in the US Armed Forces, as well as their families with property casualty insurance, banking, life insurance, investment and financial planning products and services.

Confidential

  • USAA is migrating its brokerage systems from P3 to NFS, so all USAA Enterprise Interfaces (EIs) are impacted by the source system change. P3 data is stored in an Oracle database, and corresponding Pro*C scripts extract data from the DB to create extracts for each EI. Going forward, NFS will send data in mainframe PS files, so USAA decided to replace the existing Pro*C programs with ETL jobs that extract data from the NFS files and send the corresponding extracts to all EI interfaces. USAA has a total of 24 EI interfaces.

Confidential

  • The objective of the 'BANK sandbox' is to build an Analytical Data Store (ADS) data mart by extracting data from the base area.
  • The scope of this project is to store customers' debit card and credit card details, as well as collection status with debit and balance information. The project utilizes data from the Bank and member card data sources and operationalizes the data load process within the BANK project environment. It also provides a high-level reconciliation process to confirm the accuracy of the data for modeling.
  • The BANK collections sandbox was created to simplify the reporting needs of the Bank collections area.

Confidential

Description:

  • The credit card origination workflow does not always verify that all identified FACTA alerts are sent to the bank's central FACTA repository. This project modifies the credit card origination and servicing functions to appropriately handle and clear FACTA alerts, including tracking and reporting capabilities, as required by the Regulation. It also ensures that FACTA alert data obtained as part of the monthly FICO credit score update process is stored within the bank's central FACTA repository.

Confidential

Description:

  • The objective of 'RAD' is to build a staging area to load all employees' performance appraisals. The scope of this project is to improve employee performance by ranking employees on various aspects. The project utilizes data from HR systems and operationalizes the data load process.

Confidential

Description:

  • The objective of 'Auto Product Pricing Data' is to replace USAA's existing COVI application and to build an Analytic Data Store (ADS) data mart by extracting data from the SDS (Staging Data Store). The scope of this project is to analyze the data in SDS for the Auto line of business, model the ADS, and build cubes to support the reporting and analytic capabilities of business users. Auto policy and claims data will be extracted from the SDS data warehouse, and the identified business rules will be applied to make the data meaningful. The project will retire the existing COVI application.

Confidential

Responsibilities:

  • Studied the warehouses and sources, functionally analyzed the application domains, participated in knowledge transfers from dependent teams to understand the business activities and application programs, and documented the findings for internal team reference.
  • Interacted with functional/end users to gather requirements for the core reporting system, to understand the features users expected from the ETL and reporting system, and to successfully implement the business logic.
  • Studied the detailed requirements of the system's end users and their expectations of the applications.
  • Involved in the data analysis for source and target systems and good understanding of Data Warehousing concepts, staging tables, Dimensions, Facts and Star Schema, Snowflake Schema.
  • Business process re-engineering to optimize the IT resource utilization.
  • Integration of various data sources such as Oracle, SQL Server, fixed-width and delimited flat files, DB2, COBOL files, and XML files.
  • Identify the flow of information, analyzing the existing systems, evaluating alternatives and choosing the most appropriate alternative.
  • Understood the components of a data quality plan and made informed choices between source-side and target-side data cleansing.
  • Transformed data from various sources, such as Excel and text files, into the reporting database to build a highly analytical reporting system.
  • Initiate the data modeling sessions, to design and build/append appropriate data mart models to support the reporting needs of applications.
  • Implemented Change Data Capture (CDC) using Power Exchange.
  • Used features like email notifications, scripts and variables for ETL process using Informatica Power Center.
  • Involved in data extraction from Oracle and flat files using SQL*Loader; designed and developed mappings using Informatica Power Center.
  • Developed Slowly Changing Dimension (SCD) Type 2 mappings for loading data into dimensions and facts.
  • Involved in data extraction from Oracle, flat files, and XML files using SQL*Loader and freehand SQL.
  • Used Toad to increase user productivity and application code quality while providing an interactive community to support the user experience.
  • Created the Test cases and Captured Unit test Results.
  • Good experience in writing shell scripts in Unix Environment.
  • Extensively used ETL to load data from flat files (both fixed-width and delimited) as well as from the relational database (Oracle 9i/10g).
  • Developed and tested all the Informatica mappings, sessions and workflows - involving several Tasks.
  • Imported metadata from different sources, such as relational databases, XML sources, and Impromptu catalogs, into Framework Manager.
  • Conducted and participated in process-improvement discussions, recommending possible outcomes, with a focus on production application stability and enhancements.
  • Managed the Metadata associated with the ETL processes used to populate the Data Warehouse.
  • Responsible for design reviews with Architects.
  • Collected and linked metadata from diverse sources, including relational databases (Oracle), XML, and flat files.
  • Responsible for internal reviews of the deliverables by the team members and with Tech Leads.
  • Monitored Workflows and Sessions
  • Developed Unit test cases for the jobs.
  • Responsible for attending status meeting with offshore team.
  • Recruited and managed the offshore development team to track productivity and deliverables.
  • Provided the team with technical leadership on ETL design, development best practices, version control, and customization of data loads.
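
The source-file validation and log-archival scripting described in the bullets above could be sketched roughly as follows. This is an illustrative sketch only: the function names, the trailer-count convention, and the 7-day retention rule are assumptions, not details taken from the actual project scripts.

```shell
#!/bin/sh
# Hypothetical sketch of ETL support scripting: validate an incoming
# delimited source file, then archive aged session logs.

# Validate a source file: it must exist, be non-empty, and its row count
# must match the count supplied by the source system (assumed convention).
validate_source_file() {
    file="$1"
    expected="$2"
    [ -s "$file" ] || { echo "ERROR: $file missing or empty"; return 1; }
    actual=$(wc -l < "$file")
    if [ "$actual" -ne "$expected" ]; then
        echo "ERROR: $file has $actual rows, expected $expected"
        return 1
    fi
    echo "OK: $file validated ($actual rows)"
}

# Archive session logs older than 7 days into a dated tarball, then
# remove the originals (retention period is an illustrative assumption).
archive_logs() {
    log_dir="$1"
    archive_dir="$2"
    mkdir -p "$archive_dir"
    stamp=$(date +%Y%m%d)
    find "$log_dir" -name '*.log' -mtime +7 -print > /tmp/logs_to_archive.$$
    if [ -s /tmp/logs_to_archive.$$ ]; then
        tar -czf "$archive_dir/logs_$stamp.tar.gz" -T /tmp/logs_to_archive.$$ \
            && xargs rm -f < /tmp/logs_to_archive.$$
    fi
    rm -f /tmp/logs_to_archive.$$
}
```

In practice a wrapper would FTP the file down first, call `validate_source_file` before handing the file to the Informatica workflow, and run `archive_logs` from a scheduled (e.g. Control-M) job.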

Confidential

Sr. Mainframe developer

Responsibilities:

  • Interacting with Business Users and Business Analysts to gather requirements
  • Contribution related to Requirements Study, Analysis, Estimation, Design, Development, Testing and Implementation
  • Understood the business requirements, analyzed the system for enhancements, and converted them into a technical design.
  • Design review with Architects
  • Coded new programs and modified existing programs per specification requirements in VS COBOL using DB2, VSAM files, and flat files.
  • Responsible for internal reviews of the deliverables by the team members.
  • Responsible for attending status meeting with onsite coordinators.
  • Performed unit testing, reviewed results, and prepared UTRs (Unit Test Results).
  • Peer reviewing the components.
  • Collected and linked metadata from diverse sources, including relational databases (Oracle), XML, and flat files.
  • Code reviews with Tech Leads and Maintenance Team.
  • Support for Implementation, QA and User acceptance Testing.
  • Set dependencies for new jobs and scheduled the jobs in Control-M.
  • Resolved batch cycle issues and abends during the batch run.
  • Created exception and ad hoc jobs.
  • Writing SQL queries
  • Support System testing and Release testing.
  • Implementation and warranty support.
  • Prepared induction Manuals, Flow charts of Batch/Online systems.
  • Preparing wisdom solutions for changes implemented.
  • Handled multiple projects at the same time.
  • Worked on REXX tools to improve quality of deliverables and in impact analysis.
