ETL Developer Resume
Dayton, OH
SUMMARY:
- 7+ years of IT experience in software analysis, development, manual testing of web-based applications, and implementation of business applications for the Financial, Mortgage Lending, Insurance, Healthcare, Telecom, and Sales domains.
- 5+ years of ETL and data integration experience in developing ETL mappings and scripts using Informatica Power Center 9.1/8.6.1/8.5/8.1/7.1/6.2, Power Mart 6.2/5.1, Informatica Data Quality (IDQ) and Informatica Data Analyst (IDA) tools using Designer (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer), Repository Manager, Workflow Manager & Workflow Monitor.
- 4+ years’ experience using Oracle 11g/ 9i/8i/7.x, MS SQL Server 2005/2000, Teradata V2R5/V2R4, MS Access 7.0/2000, SQL, PL/SQL, SQL*Plus, Sun Solaris 2.x
- Extensive experience in Informatica Cloud services and in the creation and maintenance of database objects such as tables, views, materialized views, indexes, constraints, primary keys, sequences, synonyms, and database links.
- Used Informatica IDQ 8.6.1 for initial data profiling and for matching and removing duplicate records.
- Extensive experience in development and modification of PL/SQL scripts, anonymous blocks, triggers, procedures, functions, and packages.
- Created ETL test data for all ETL mapping rules to test the functionality of the Informatica mappings.
- Good at Data Warehouse techniques - Dimensional data Modeling, Star Schema and Snowflake Schema
- Experienced in creating Transformations and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
- Experienced in Performance tuning of targets, sources, mappings and sessions.
- Extensively worked on Informatica Designer components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
- Involved in the creation of new objects (tables/views, triggers, indexes, keys) in Teradata and modified the existing 700+ ETL jobs to point to the appropriate environment.
- Strong Experience on Workflow Manager Tools - Task Developer, Workflow & Worklet Designer.
- Experience in preparing Test Strategy, developing Test Plan, Detailed Test Cases, writing Test Scripts by decomposing Business Requirements, and developing Test Scenarios to support quality deliverables.
- Created test cases and developed the Traceability Matrix and test coverage reports.
- Experienced in integration of various data sources such as Salesforce, Oracle, DB2, SQL Server, and MS Access into the staging area.
- Scheduled and monitored the nightly ETL process and ensured that the Clarity ETL process completed.
- Extensively used the vi editor to modify UNIX scripts.
- Performed SDLC cycle of Data Integration, Unit testing, System Integration testing, Implementation, Maintenance and Performance tuning.
- Good experience on scheduling jobs using UNIX AUTOSYS.
- Expertise in SQL queries and Query Optimization, Report Testing techniques.
- Experience in Teradata and its utilities such as BTEQ, MLOAD, and FLOAD.
- Built reports and dashboards using Business Objects. Expertise in Business Objects Universe Designer, adding objects and classes, and building reports through Web Intelligence.
- Participated in design and code reviews and verified compliance with the project’s plan.
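The Teradata utilities listed above (BTEQ, MLOAD, FLOAD) are typically driven from shell wrappers. A minimal sketch of such a wrapper, assuming hypothetical logon and table names (the bteq invocation is guarded so the script degrades gracefully where the utility is absent):

```shell
#!/bin/sh
# Sketch of a BTEQ wrapper: write the query script, then run it if the
# bteq utility is available. The TDPID, credentials, and table name are
# hypothetical placeholders, not values from any real environment.
SCRIPT=/tmp/daily_load.btq

cat > "$SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password
SELECT COUNT(*) FROM stg.customer_stage;
.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
EOF

if command -v bteq >/dev/null 2>&1; then
    bteq < "$SCRIPT"        # run the script against Teradata
else
    echo "bteq not installed; script written to $SCRIPT"
fi
```

In practice the `.QUIT 8` error path lets a scheduler such as Autosys treat a failed query as a failed job.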
TECHNICAL SKILLS:
ETL Tools: Informatica Power Center 9.6.1/9.1/9.0/8.6/8.5/7.x, Informatica Multidomain MDM, IBM InfoSphere (DataStage, QualityStage), Informatica Power Exchange, SSIS, SSRS, IDQ, IDE, Data Quality
Databases: Oracle 12c/11g/10g/9i, DB2 8.0/7.0, MS SQL Server 2016, Sybase, Teradata, SAP R/3, Salesforce, MS Access 7.0/97/2000, SAS
Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling, Erwin 4.0/3.5.2/3.x.
GUI: TOAD, Visual Basic 5.0/6.0, FrontPage 97/98/2000.
Programming: Visual Basic 6.0/5.0, PowerBuilder 6.0, C, PL/SQL, JavaScript, PERL, VBScript, HTML, XML, UNIX shell scripting, DHTML.
Design Tools: Erwin 4.5/4.0, Oracle Designer 2000.
Environment: Windows 2000/XP/Vista/10, UNIX AIX 5.2/5.3, Linux, Windows NT 4.0
PROFESSIONAL EXPERIENCE:
Confidential, Dayton, OH
ETL Developer
Responsibilities:
- Developed complex ETL and BI reporting using claims, enrollment and pharmacy data to support financial business operations.
- Involved in creating SSIS packages and dashboards using Power BI based on the client requirements for the Claims, Enrollment and Pharmacy data.
- Created SSIS packages using Merge, Aggregate, Sort, Multicast, Conditional Split, and Derived Column transformations in the data flow and control flow of the package.
- Developed Informatica mappings, reusable transformations and created PL/SQL Stored procedures/packages for extracting data from source systems to staging and processing and loading to the data warehouse.
- Developed Informatica mappings, mapping configuration tasks, and task flows using Informatica Cloud Services (ICS).
- Worked on the installation and setup of ETL (Informatica Cloud) applications on Linux servers.
- Involved in developing, configuring and deploying reports using SSRS.
- Maintained records in Excel sheets and explored data in the SQL Server database.
- Unit testing the code and coordinating with Software Quality Analysis team and UAT team.
- Designed, developed and deployed reports and ad-hoc reports.
- Involved in developing reports for client management using Microsoft SQL server reporting services posted to internal Portal and deploying them.
- Developed Tabular Reports, ad-hoc reports, Sub Reports, Graphing, Data Drill-Down, and Data sorting/grouping and created various dynamic reports using SSRS Report Designer
- Created SSRS inventory management reports whose findings saved the company millions of dollars in client member performance health management by providing Incentive, Claims, and Biometrics file feeds. Identified high-, moderate-, and low-risk members using SSRS dashboards.
- Created and managed schema objects such as tables, views, indexes, stored procedures, and triggers & maintaining Referential Integrity.
- Used DTS/SSIS and T-SQL stored procedures to transfer data from OLTP databases to staging area and finally transfer into data marts and performed action in XML.
- Developed Informatica mappings for data extraction and loading.
- Developed initial and incremental data loads in Informatica using Update strategy transformation.
- Worked with Expression, Lookup, Filter and Sequence generator and Aggregator Transformations to load the data from source to target.
- Participated in Testing and performance tuning by identifying bottlenecks in mapping logic and resolving them, setting cache values and creating partitions for parallel processing of data.
- Involved in continuous enhancements and fixing of production issues. Generated server side PL/SQL scripts for data manipulation and validation and materialized views for remote instances.
- Worked on Teradata utilities such as BTEQ, MLOAD, and FLOAD, and wrote macros and functions.
- Optimized and tuned Teradata views and SQL to improve batch performance and data response time for users.
- Defined and executed manual test cases that tested all aspects of the component under development.
- Prepared the test strategy, developed test plans and detailed test cases, wrote test scripts by decomposing business requirements, and developed test scenarios to support quality deliverables.
- Created implementation plan for code deployment and involved in setting up builds and project deployment using Team City and Octopus.
- Involved in developing OLAP cubes by identifying tables (fact and dimension). Created OLAP cubes by using Fact and Dimension tables and views.
- Designed complex SSAS solutions using multiple dimensions, perspectives, hierarchies, measures groups. Designed OLAP cubes with star schema and multiple partitions using SSAS.
- Scheduled jobs to execute the stored SSIS packages/master packages and automated the process on daily, weekly, and monthly schedules based on the requirement.
- Scheduled ETL packages for event-driven rather than time-based execution, allowing more efficient use of the designated batch-processing windows.
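One common way to implement event-driven batch execution is a trigger-file poll: the load starts when an upstream process drops a marker file rather than at a fixed clock time. A minimal sketch, with hypothetical file paths and the actual load step stubbed out (the touch simulating the upstream extract is for demonstration only):

```shell
#!/bin/sh
# Event-driven batch sketch: wait for an upstream trigger file instead
# of running at a fixed time. Paths and the load command are hypothetical.
TRIGGER=/tmp/claims_extract.done
LIMIT=30            # give up after 30 polls
INTERVAL=1          # seconds between polls

touch "$TRIGGER"    # demo only: simulate the upstream extract finishing

i=0
while [ ! -f "$TRIGGER" ]; do
    i=$((i + 1))
    if [ "$i" -ge "$LIMIT" ]; then
        echo "trigger never arrived; aborting batch" >&2
        exit 1
    fi
    sleep "$INTERVAL"
done

echo "trigger found; starting load"
# the real batch would launch the SSIS package here, e.g. via dtexec
rm -f "$TRIGGER"    # consume the trigger so the batch runs once per event
```

Deleting the trigger after a successful run is what makes the window "event driven": the next cycle blocks until the upstream feed completes again.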
Environment: Microsoft SQL Server Integration Services (SSIS), Power BI, Microsoft SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), Informatica PowerCenter 9.6/10, PL/SQL, Teradata, Microsoft Excel, Team City, Octopus, Tidal, UNIX Autosys
Confidential, Cincinnati, OH
Business/Data Warehouse Analyst
Responsibilities:
- Business Analyst/Data Analyst in the Enterprise Data Warehouse (EDW) for the Commercial side of the bank.
- Liaison among stakeholders in order to define needs and recommend solutions that deliver value to stakeholders and enable the organization to achieve its goals.
- Analyzed the business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter.
- Experienced in translating high-level design specs into simple ETL coding and mapping standards.
- Worked on Agile Methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), peer reviewed their development works and provided the technical solutions. Proposed ETL strategies based on requirements.
- Worked with team to convert Trillium process into Informatica IDQ objects.
- Extensively worked on UNIX shell scripts for server Health Check monitoring such as Repository Backup, CPU/Disk space utilization, Informatica Server monitoring, UNIX file system maintenance/cleanup and scripts using Informatica Command line utilities.
- Worked on Informatica Data Quality (IDQ) toolkit, analysis, data cleansing, data matching, data conversion, address standardization, exception handling, reporting and monitoring capabilities of IDQ.
- Functioned as liaison between multiple business lines, IT, and Operations to analyze business and user needs, document requirements, and resolve complex system problems throughout the project cycle.
- Assisted the IT Infrastructure team in defining and implementing its business plan and goals to support its strategy. Centralized the data collection and distribution of all infrastructure and financial data.
- Provide complex and detail-oriented analysis and evaluation of financial and Infrastructure data.
- Partner with different business units including Finance, Commercial Loans, Mortgage, Annuities, Trust, Accounting; to gain a thorough knowledge base of the various business lines, business systems and industry requirements including the business plan, products, processes and revenue streams.
- Reviewed operational procedures and methods and recommended changes for improvement, with an emphasis on automation and efficiency.
- Worked with multiple business units and coordinated efforts across multiple projects in researching and analyzing business requirements.
- Provide guidance and context in prioritizing and determining complexity of multiple problems and requests.
- Manage monthly and ad-hoc business analysis and related activities while being responsible for creating management-level presentations.
- Ensure data process enhancements follow the appropriate IT guidelines, meet or exceed user requirements, and are completed in a timely fashion
QA Responsibilities:
- Worked with resources both in house and third party to identify data process enhancements, document business needs, and ensure development work is completed to specification.
- Creating Functional and Technical Specification documents for the requirements.
- Worked with SQL queries to test and dig into data.
- Consolidating CRM data and co-locating it to one area for consumption.
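The server health-check scripts mentioned above typically reduce to a few df/awk checks run on a schedule. A sketch of a disk-utilization monitor, assuming a hypothetical 90% warning threshold and log path:

```shell
#!/bin/sh
# Disk-utilization health check sketch: warn when any filesystem
# exceeds a threshold. The 90% threshold and log path are hypothetical
# placeholders for whatever the monitoring standard requires.
THRESHOLD=90

# df -P gives POSIX single-line output; column 5 is use%, column 6 is mount.
df -P | awk 'NR > 1 {print $5, $6}' | while read PCT MOUNT; do
    USED=${PCT%\%}                       # strip the trailing percent sign
    if [ "$USED" -ge "$THRESHOLD" ] 2>/dev/null; then
        echo "WARN: $MOUNT at $PCT used"
    fi
done

echo "health check complete" | tee /tmp/healthcheck.log
```

A real monitoring script would mail the WARN lines or feed them to the scheduler; here they simply go to stdout.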
Environment: IBM Datastage, IBM DB2, PL/SQL, Tableau, SAP BO, SDLC, UNIX shell scripting, AGILE, Erwin, Autosys, Business objects, Windows XP.
Confidential, Montvale, NJ
Data Warehouse Analyst
Responsibilities:
- Involved in high-level design and low-level design documents.
- Involved in test strategy and test case design.
- Designed and developed ICS tasks and custom integration between Transaction system and Salesforce.com CRM.
- Built Mapplet / Template to be used within the ICS DSS Synchronization Tasks.
- Extensively worked on complex mappings, mapplets, and workflows to meet the business needs and ensured transformations were reusable to avoid duplication.
- Worked on various kinds of transformations like Expression, Aggregator, Stored Procedure, Java, Lookup, Filter, Joiner, Rank, Router and Update Strategy.
- Working with end customer to finalize the requirement.
- Creating BTEQ (Basic Teradata Query) scripts to generate Keys.
- Migration of ETL Codes across environments (Development, Test, Production).
- Development of UNIX scripts to run the Informatica workflow real time and responsible for fixing defects during development, SIT, UAT, post production.
- Create and maintain a project plan and update it on a weekly review with the IM and onsite coordinator of the Project.
- Worked on performance tuning.
- Knowledge sharing within the team whenever required.
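Running an Informatica workflow from UNIX, as described above, is usually done through the `pmcmd` command-line utility. A guarded sketch of such a wrapper; the service, domain, folder, and workflow names are hypothetical, and `PM_USER`/`PM_PASS` stand in for environment variables that would hold credentials:

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper to start an Informatica workflow from the
# shell. All names below are hypothetical placeholders; credentials are
# passed via environment variables using pmcmd's -uv/-pv options.
SVC=IS_PROD
DOMAIN=Domain_PROD
FOLDER=SALES_DM
WF=wf_daily_sync

if command -v pmcmd >/dev/null 2>&1; then
    pmcmd startworkflow -sv "$SVC" -d "$DOMAIN" \
        -uv PM_USER -pv PM_PASS -f "$FOLDER" -wait "$WF"
    echo "started $WF (exit $?)" | tee /tmp/wf_start.log
else
    echo "pmcmd not on PATH; would start $WF in $FOLDER" | tee /tmp/wf_start.log
fi
```

The `-wait` flag makes the wrapper block until the workflow finishes, so the shell exit code can drive SIT/UAT defect triage or downstream job dependencies.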
Environment: Informatica Power Center 9.6.1/9.5, Informatica Cloud ICS, Oracle 12c, PL/SQL, SAP BW, Tableau, Teradata, SAP BO, SSRS, Windows, IDQ, IDE, SDLC, UNIX shell scripting, AGILE, Erwin, TOAD, Oracle, Autosys, Business Objects, Windows XP.
Centene Corporation, St. Louis, MO
Informatica Developer
Responsibilities:
- Extensively used Informatica Power Center 9.6.1 for ETL (Extraction, Transformation and Loading) of data from relational tables.
- Used Protegrity functions to tokenize and de-tokenize the data.
- Walked through the Informatica and Teradata code to identify protected information references of columns like SSN, Medicaid number, Last name and first name.
- Worked on creating sessions in the workflows based on the requirement.
- Worked with data architecture teams to augment and define new structures.
- Extensively worked on making changes to the parameter files if needed. All the ETL code is in Linux scripts.
- Extensively used Tidal for scheduling the jobs when needed.
- Involved in upgrading Informatica Power Center 9.5.1 to 9.6.1.
- Written SQL queries to check whether the data is tokenized or not.
- Worked on Query banding to pull all the data required for tokenization from the repository.
- Deployed the code or changes made from Development to Test.
- Extensively coordinated with other departments of the company to make desired changes to the workflows.
- Involved in testing all the sessions, workflows, to check if the desired changes were made.
- Installed, configured, maintained, and upgraded the Epic Clarity console server.
- Created export scripts using Teradata Fast export Utility.
- Involved in Unit testing, System testing to check whether the data loads into target are accurate, which was extracted from different source systems according to the user requirements.
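A tokenization check like the one described above boils down to scanning for values that still match a protected pattern (here, a raw SSN). The real check ran as SQL against the warehouse; this sketch shows the same idea against a flat extract, with a hypothetical file name and demo rows:

```shell
#!/bin/sh
# Tokenization spot-check sketch: scan an extract for values that still
# look like raw SSNs (ddd-dd-dddd). The file name and sample rows are
# hypothetical; the production check queried the repository via SQL.
EXTRACT=/tmp/member_extract.txt

# Demo data: one tokenized value, one raw SSN that should be flagged.
cat > "$EXTRACT" <<'EOF'
1001|TKN_8f3a9c|SMITH
1002|123-45-6789|JONES
EOF

LEAKS=$(grep -cE '[0-9]{3}-[0-9]{2}-[0-9]{4}' "$EXTRACT")
echo "untokenized rows: $LEAKS"
```

A nonzero count here would fail the load gate, since it means protected values (SSN, Medicaid number, names) escaped tokenization.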
Environment: Informatica Power Center 9.6.1/9.5, Informatica Cloud, Java, Teradata SQL assistant, PL/SQL, Tidal, Tableau, SAP BO, SSRS, Windows, IDQ, IDE, SDLC, UNIX shell scripting, AGILE, Erwin, TOAD, Oracle, Autosys, Business objects, Windows XP.
Confidential, Denver, CO
Informatica Developer
Responsibilities:
- Analyzed business documents and created system requirement specification.
- Extensively used Informatica Power Center 9.5 for ETL (Extraction, Transformation and Loading) of data from relational tables and flat files.
- Extensively worked on complex mappings, mapplets, and workflows to meet the business needs and ensured transformations were reusable to avoid duplication.
- Designed and developed star schema, snowflake schema and created fact tables and dimension tables for the warehouse and data marts using Erwin.
- Implemented Join, Expressions, Aggregator, Rank, Lookup, Update Strategy, Filter and Router transformations in mappings.
- Utilized the Informatica Data quality management suite (IDQ and IDE) to identify and merge customers and addresses.
- Responsible for creating complete test cases, test plans, and test data, and reporting status, ensuring accurate coverage of requirements and business processes.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mappings, build, unit testing, systems integration and user acceptance testing.
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Design and schedule the Autosys process to execute daily, weekly and monthly jobs.
- Creating BTEQ (Basic Teradata Query) scripts to generate Keys.
- Performed data validation testing by writing SQL queries.
- Used Repository Manager to create folders, which are used to organize and store all metadata in the repository.
- Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center 9.5.1.
- Analyzed, designed and implemented ODS, data marts, data warehouses and operational databases.
- Migrated mappings from Informatica 9.1.1 to Informatica 9.5.1, which includes grid technology.
- Created Dimensional and Relational Physical & logical data modeling fact and dimensional tables using Erwin.
- Worked with the domain and nodes and used the Informatica Integration Service to run the workflows in 9.5.1.
- Design and development of the Informatica workflows/sessions to extract, transform and load the data into Target. Created database triggers for Data Security.
- Developed Informatica mappings, reusable transformations, reusable mappings and mapplets to load data into the data warehouse.
- Developed shell scripts for Daily and weekly Loads and scheduled using Unix Maestro utility.
- Created export scripts using Teradata Fast export Utility.
- Involved in writing SQL scripts, stored procedures and functions and debugging.
- Involved in Functional Testing & Regression Testing.
- Responsible for providing comments for user stories within an AGILE software development SCRUM environment.
- Created sessions and batches and tuned performance of Informatica sessions for large data files by increasing the block size, data cache size and target based commit interval.
- Prepared test data by modifying the sample data in the source systems, to cover all the requirements and scenarios.
- Used debugger to test mapping at designer level.
- Developed email routines to indicate failure or successful completion of workflows.
- Experienced in retesting existing test cases against different kinds of source systems for different periods of data.
- Created and maintained database objects, especially as related to Epic and system updates and upgrades.
- Coordinator for Data model upgrades for Epic system for version upgrades and quarter upgrades.
- Created, configured, and scheduled the sessions and batches for different mappings using Workflow Manager and UNIX scripts.
- Involved in upgrading from Informatica Power Center 8.6 to 9.5.1.
Environment: Informatica Power Center 9.5/8.6, IBM InfoSphere DataStage, Power Mart, Oracle 11g/10g, PL/SQL, Cognos, SAP BO, SSRS, Windows, IDQ, IDE, SDLC, UNIX shell scripting, AGILE, RUP, UML, Erwin, TOAD, Teradata, Autosys, Mercury Quality Center, Business Objects, Windows XP.
Confidential
Developer / Tester
Responsibilities:
- Developed and supported the Extraction, Transformation, and load process (ETL) for data migration using Informatica power center.
- Responsible for developing Source to Target Mappings.
- Extensively used Informatica Client tools- Source Analyzer, Warehouse Designer, Mapping Designer.
- Developed Informatica mappings for data extraction and loading; worked with Expression, Lookup, Filter, Sequence Generator, and Aggregator transformations to load the data from source to target.
- Conceptualized and developed initial and incremental data loads in Informatica using Update strategy transformation.
- Sourced data from Teradata to Oracle using Fast Export and Oracle SQL Loader.
- Developed mappings, sessions for relation and flat source and targets.
- Developed Single and multiple dashboards and scorecards using business objects.
- Imported data from various sources (Oracle, flat files, XML), then transformed and loaded it into targets using Informatica.
- Written SQL queries to access the data in the Mainframe DB2 database.
- Monitored Workflows and Sessions
- Developed Unit test cases for the jobs.
- Identified the facts and dimensions and designed the relevant dimension and fact tables
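The unit test cases mentioned above usually start with the most basic ETL check: reconciling source and target row counts after a load. A sketch using hypothetical files as stand-ins for the source and target query results:

```shell
#!/bin/sh
# Unit-test sketch: reconcile source and target row counts after a
# load. The file names and three-row sample are hypothetical stand-ins
# for the result sets of source and target COUNT(*) queries.
SRC=/tmp/src_rows.txt
TGT=/tmp/tgt_rows.txt

printf 'a\nb\nc\n' > "$SRC"   # 3 source rows (demo data)
printf 'a\nb\nc\n' > "$TGT"   # 3 target rows (demo data)

SRC_CNT=$(wc -l < "$SRC")
TGT_CNT=$(wc -l < "$TGT")

if [ "$SRC_CNT" -eq "$TGT_CNT" ]; then
    echo "PASS: counts match ($SRC_CNT rows)"
else
    echo "FAIL: source=$SRC_CNT target=$TGT_CNT"
fi
```

In a real job this comparison would run against database counts and feed the scheduler's pass/fail status for the load.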
Environment: Informatica 8.6, Oracle 9i, Erwin 4.0, IDQ, IDE, Teradata, PL/SQL, UNIX shell scripting, SQL Server 2005, Business Objects 6.0, DB2, Autosys, UNIX and Windows NT