Senior ETL Developer Resume
SUMMARY:
- Experience in data warehousing and business intelligence with emphasis on business requirements analysis, application design, development, testing, deployment, implementation, and maintenance of client/server data warehouse and data mart systems, as well as administration of tools.
- Experience in the development and design of ETL methodology for supporting data transformations and processing in a wide range of ETL solutions using Informatica PowerCenter 10.1 and Autosys.
- Performed architecture tasks including data modeling, reverse engineering, and performance optimization.
- Design, develop, unit test and maintain ETL jobs to create or enhance data warehouses/marts.
- Design ETL via source-to-target mapping and design documents that consider security, performance tuning, and other best practices.
- Worked on data cleansing and data profiling, with expertise in data quality analysis.
- Test ETL and other technical components and support QA activities.
- Deploy and test code from lower environments to production.
- Document technical specifications, data models, process flows, etc.
- Identify improvement areas and propose solutions.
- Understand the current application with minimal guidance.
- Expertise in documenting technical processes by studying the existing application.
- Collaborate with delivery and technical team members on design and development activities within Agile/Scrum teams.
- Migrated on-prem Informatica PowerCenter servers by moving the code to Informatica Cloud Services.
- Collaborate with business partners to understand business processes, standard release management requirements, and underlying data and reporting needs.
- Collaborate with IT/enterprise partners to understand processes and standard release management requirements.
- Conduct data analysis in support of ETL development, issue analysis and resolution, and related activities.
- Worked in PL/SQL and experienced in loading data into data warehouses/data marts using Informatica and SQL*Loader.
- Created stored procedures, packages, functions, triggers, and other database objects (see the PL/SQL sketch after this summary).
- Worked in production support: monitored jobs in production, fixed failures based on severity, and loaded the data.
- Experience in Performance tuning using pushdown optimization and session partitioning.
- Experience using Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, DB2, PL/SQL, SQL*Plus, SQL*Loader, and Developer 2000. Hands-on working knowledge of Oracle and PL/SQL, writing stored procedures, packages, functions, and triggers.
- Experience with TOAD, AQT, SQL Developer and Data Studio tools to test, modify and analyze the data, create indexes, and compare data from different schemas.
- Experienced in UNIX shell scripting using environment variables, Linux commands, PL/SQL procedures, and SharePoint. Created multiple Korn shell scripts to automate and schedule batch processes, check disk space, and FTP files.
- Worked with scheduling tools such as Autosys (creating JIL scripts) and Control-M, and testing tools such as Quality Center.
- Strong knowledge of the Software Development Life Cycle (SDLC) with industry-standard methodologies such as Waterfall, Agile, and Scrum, including requirement analysis, design, development, testing, support, and implementation.
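To illustrate the PL/SQL stored-procedure and data-load work referenced in this summary, here is a minimal sketch of a mart-load procedure. The staging and mart tables (stg_orders, dm_orders) and the procedure name (load_dm_orders) are hypothetical, not from an actual engagement.

    CREATE OR REPLACE PROCEDURE load_dm_orders AS
    BEGIN
      -- Merge staged rows into the mart table:
      -- update existing order keys, insert new ones
      MERGE INTO dm_orders t
      USING stg_orders s
      ON (t.order_id = s.order_id)
      WHEN MATCHED THEN UPDATE SET
        t.order_amt  = s.order_amt,
        t.updated_dt = SYSDATE
      WHEN NOT MATCHED THEN INSERT (order_id, order_amt, load_dt)
        VALUES (s.order_id, s.order_amt, SYSDATE);
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;  -- surface the failure to the scheduling job
    END load_dm_orders;
    /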
TECHNICAL SKILLS:
Data Warehouse Tools: Informatica PowerCenter 9.6/9.5/8.x/7.x, DataStage, Informatica PowerAnalyzer/PowerMart 9.x
Operating System: Windows XP/Vista/08/07, UNIX, Linux
Databases: Oracle, SQL Server, DB2 (PL/SQL, T-SQL)
Reporting Tools: SSRS, SSAS, Tableau Server
Dimensional Data Modeling: Star and snowflake schema modeling, Erwin
Other Tools: Informatica PowerCenter, ILM TDM, Autosys, MJS, Toad, JIRA, Quality Center
Languages: C, C++
PROFESSIONAL EXPERIENCE:
Confidential
Senior ETL Developer
Responsibilities:
- Created a proof of concept (POC) for IDQ.
- Created LDOs (Logical Data Objects), PDOs (Physical Data Objects), and profiles for each JIRA story.
- Created scorecards in Informatica Analyst to track valid and invalid rows.
- Monitored scorecard run times in the Informatica monitoring tool.
- Worked on data trends to maintain data quality maturity, performed data preparation activities, held overall data quality responsibility, and made extensive use of Excel and SQL.
- Performed data remediation by cleansing, organizing, and migrating data so it is fit for purpose, and detected and corrected (or removed) corrupt or inaccurate records by replacing, modifying, or deleting the “dirty” data.
- Worked in IBM Data Studio to pull valid and invalid records.
- Created various mappings in LDOs using transformations such as Union, Sorter, Joiner, Expression, and Aggregator.
- Used SSIS (SQL Server Integration Services) Designer to create a simple package that takes data from a flat file, reformats the data, and inserts it into a fact table.
- Extensively used SSIS tool to automate maintenance of SQL Server databases and updates to multidimensional cube data.
- Performed data integration with SFDC and AWS using Informatica Cloud.
- Migrated on-prem Informatica PowerCenter code to Informatica Cloud Services and exposed processes as RESTful API services to publish data to external systems.
- Gathered business requirements and created technical specifications along with internal and external design documents.
- Created technical specification documents and worked with business analysts on the analysis of multiple source systems' data as part of the ETL mapping exercise.
- Conducted meetings with the BSA team to understand the FRDs (Functional Requirement Documents).
- Designed matching plans, helped determine the best matching algorithms, configured identity matching, and analyzed duplicates using Informatica Data Quality (IDQ).
- Maintained quality in development and deliverables with automated and peer code reviews.
- Implemented best practices and standards in code and ensured the code addressed exception handling, comments, performance, and scalability.
- Analyzed the Business Requirement Documents (BRD) and laid out the steps for the data extraction, business logic implementation & loading into targets.
- Responsible for Impact Analysis, upstream/downstream impacts.
- Knowledge of the FACETS healthcare application, provider data, and vendor data.
- Created detailed Technical specifications for Data Warehouse and ETL processes.
- Used Informatica as the ETL tool, along with stored procedures, to pull data from source systems/files, then cleanse, transform, and load the data into Teradata using Teradata utilities.
- Knowledge of working on Data warehouses and Data marts.
- Created exclusion reports with SSRS for all fallout providers.
- Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Used most of the transformations such as the Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookups, Joiner, update strategy and stored procedure.
- Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
- Developed mappings to load fact and dimension tables, SCD Type 1 and SCD Type 2 dimensions, and incremental loads, and unit tested the mappings (see the SQL sketch at the end of this role).
- Successfully upgraded Informatica 9.5 to 9.6 and was responsible for validating objects in the new version of Informatica.
- Implemented the business rules, extracted the data from various sources such as SQL Server, CA Secure, and EPDS R6 (DB2), and loaded the required data into flat files per the specifications.
- Full lifecycle exposure to Data Warehouse projects and Data marts with Star and Snowflake Schemas.
- Involved in Initial loads, Incremental loads and Daily loads to ensure that the data is loaded in the tables in a timely and appropriate manner.
- Extensively worked in the performance tuning of Teradata SQL, ETL and other processes to optimize session performance.
- Loaded data into Teradata tables using the Teradata utilities BTEQ, FastLoad, MultiLoad, FastExport, and TPT.
- Worked extensively with different Caches such as Index cache, Data cache and Lookup cache (Static, Dynamic and Persistence) while developing the Mappings.
- Created Reusable transformations, Mapplets, Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
- Integrated the data into a centralized location using migration, redesign, and evaluation approaches.
- Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
- Tuned the performance of mappings by following Informatica best practices and applied several methods to decrease workflow run times.
- Worked extensively on Informatica partitioning when dealing with huge volumes of data, and partitioned tables in Teradata for optimal performance.
- Scheduled Informatica jobs and implemented dependencies where necessary using Autosys.
- Managed post-production issues and delivered all assignments/projects within specified timelines.
Environment: Informatica PowerCenter 10.1, Oracle 11g, DB2, Teradata, Flat Files, Erwin 4.1.2, SQL Assistant, Toad, WinSCP, PuTTY, Autosys, UNIX
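The SCD Type 2 loads above were built as Informatica mappings; the ANSI-style SQL below is only a sketch of the underlying pattern, with hypothetical table and column names (stg_provider, dim_provider, eff_start_dt, eff_end_dt, current_flag).

    -- Step 1: close out the current version of any provider whose
    -- tracked attribute (here, provider_name) has changed
    UPDATE dim_provider
    SET eff_end_dt   = CURRENT_DATE - 1,
        current_flag = 'N'
    WHERE current_flag = 'Y'
      AND EXISTS (
        SELECT 1
        FROM stg_provider s
        WHERE s.provider_id = dim_provider.provider_id
          AND COALESCE(s.provider_name, '') <> COALESCE(dim_provider.provider_name, ''));

    -- Step 2: insert a fresh current row for changed and brand-new providers
    -- (after step 1, neither group has an open current version)
    INSERT INTO dim_provider
      (provider_id, provider_name, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.provider_id, s.provider_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM stg_provider s
    LEFT JOIN dim_provider d
      ON d.provider_id = s.provider_id
     AND d.current_flag = 'Y'
    WHERE d.provider_id IS NULL;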
Confidential, TX
ETL Developer
Responsibilities:
- Worked closely with IT and the business group to understand business reporting requirements, analyze the logical model, and develop subject matter expertise in a short time.
- Participated in the development and execution of tactics and strategies to optimize data quality in the data warehouse and OLAP environment.
- Worked on Informatica PowerCenter tools: Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Mapplets, and reusable transformations.
- Developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Developed multiple Teradata BTEQ scripts to load data into target tables (see the SQL sketch at the end of this role).
- Mapped source system data elements to target systems and developed, tested, and supported extraction, transformation, and load processes: defined mappings, sessions, and workflows.
- Provided on-call support for Informatica.
- Monitored the day-to-day activity of the Informatica jobs.
- Worked with the UNIX team to write UNIX shell scripts that customize server scheduling jobs.
- Troubleshot issues related to databases, job failures, software bugs, and performance bottlenecks, working with multiple cross-functional teams such as DBAs, UNIX server administrators, and Informatica Global Support to drive problems to resolution.
- Provided technical documentation as a part of support/upgrade/maintenance.
Environment: Informatica PowerCenter 9.6, Shell Scripts, Teradata 15, SQL, UNIX, Toad, SQL Developer, Cognos 9
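As a sketch of the BTEQ loads above: a BTEQ script is essentially Teradata SQL wrapped in session commands, and its core is typically an incremental INSERT...SELECT like the one below. The staging, warehouse, and watermark tables (stg.claim, edw.claim_fact, etl.load_watermark) are hypothetical names used only for illustration.

    -- Incremental load: pick up only rows newer than the last successful run,
    -- as recorded in a watermark control table
    INSERT INTO edw.claim_fact
      (claim_id, member_id, claim_amt, svc_dt, load_ts)
    SELECT s.claim_id, s.member_id, s.claim_amt, s.svc_dt, CURRENT_TIMESTAMP
    FROM stg.claim s
    WHERE s.updated_ts > (SELECT MAX(w.last_load_ts)
                          FROM etl.load_watermark w
                          WHERE w.table_name = 'claim_fact');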
Confidential, NC
Informatica Developer
Responsibilities:
- Involved in understanding the business requirements, discussing them with business analysts, analyzing the requirements, and preparing business rules.
- Designed and developed complex mappings using Lookup, Expression, Sequence Generator, Update Strategy, Aggregator, Router, and Stored Procedure transformations to implement complex logic.
- Developed mappings using Informatica to load the data from sources such as Relational tables, Flat files, Oracle tables into the target Data warehouse.
- Developed mappings, transformations, and mapplets using Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter.
- Extensively worked with transformations like Lookup, Expression, Router, Joiner, Update Strategy, Filter, and Aggregate.
- Extensively worked on SQL overrides in the Source Qualifier transformation for better performance (see the SQL sketch at the end of this role).
- Hands-on experience using query tools such as SQL Developer and PL/SQL Developer.
- Designed and developed stored procedures using PL/SQL and tuned SQL queries for better performance.
- Implemented slowly changing dimension (SCD Type 1 & 2) in various mappings.
- Created sessions, database connections and batches using Informatica Server Manager/Workflow Manager.
- Extensively used the Informatica PowerCenter Workflow Manager to create sessions, workflows, and batches to run with the logic implemented in the mappings.
- Coordinated all ETL (Extract, Transformation and loading) activities and enhancements using Agile Methodology.
- Extensively used the Informatica Debugger to find problems in mappings, and was involved in troubleshooting and rectifying bugs.
- Involved in all phases of the software development life cycle on this project, including but not limited to requirement gathering, analysis, technical design, development, testing, and user acceptance testing.
Environment: Informatica PowerCenter 9.6, Oracle 11g, UNIX, XML, PLSQL, Windows XP/Vista, Quality Center.
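The Source Qualifier SQL overrides mentioned above push joins and filters down to the source database so that only qualifying rows enter the mapping. A representative Oracle-style override is sketched below; the tables and columns (orders, customers, order_dt) are illustrative, not from an actual project.

    -- Join and filter at the source instead of inside the mapping,
    -- so only the last seven days of qualifying rows cross the network
    SELECT o.order_id,
           o.customer_id,
           c.customer_name,
           o.order_amt
    FROM   orders o
    JOIN   customers c
      ON   c.customer_id = o.customer_id
    WHERE  o.order_dt >= TRUNC(SYSDATE) - 7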
Confidential, MI
ETL Consultant
Responsibilities:
- Responsible for resolving report, data integration, design issues, enhancing and maintaining retail management analytics.
- Interacted with Business Analysts on Mapping documents and Design process for various requirements
- Worked on PowerCenter tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Enhanced mappings in Informatica PowerCenter using a variety of transformations, mapping parameters, and parameter files in Mapping Designer.
- Worked extensively on Informatica transformations like Source Qualifier, Joiner, Expression, Filter, Router, Aggregator, Lookup, Update strategy.
- Implemented a sync-back mechanism to log errored-out records and load them on the next run.
- Deployed reusable objects such as sessions and tasks to avoid duplication of metadata, reducing development time.
- Implemented performance and query tuning on all the objects of Informatica.
- Worked on several extract- and load-type mappings.
- Worked with SQL Override in the Source Qualifier and Lookup transformation.
- Extensively used functions such as LTRIM, RTRIM, ISNULL, ISDATE, TO_DATE, DECODE, SUBSTR, INSTR, and IIF (see the SQL sketch at the end of this role).
- Accelerated data integration and data migration into the Salesforce CRM application.
- Used Update Strategy expressions (DD_INSERT, DD_UPDATE) to insert and update data when implementing the slowly changing dimension logic.
- Developed SCD Type 2 logic to capture new changes while maintaining the historical information.
- Reduced the amount of data moving through flows, which had a significant impact on mapping performance.
- Used Debugger to test the mappings and fixed the bugs.
- Prepared detailed documentation of the developed code for QA, to be used as a guide for future migration work.
Environment: Informatica PowerCenter 9.6.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Teradata SQL Assistant 15.x, TOAD 12.8, Oracle 12c, Salesforce, WinSCP, Control-M, Shell Scripting
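To illustrate the cleansing functions listed above, here is a small Oracle-flavored example against a hypothetical staging table (stg_customer); IIF, ISNULL, and ISDATE come from the Informatica expression language rather than Oracle SQL and are omitted here.

    SELECT LTRIM(RTRIM(cust_name))              AS cust_name,  -- strip stray spaces
           DECODE(gender_cd, 'M', 'Male',
                             'F', 'Female',
                             'Unknown')         AS gender,     -- translate coded values
           SUBSTR(phone, 1, 3)                  AS area_code,  -- leading digits only
           INSTR(email, '@')                    AS at_pos,     -- position of '@', 0 if absent
           TO_DATE(birth_dt_txt, 'YYYY-MM-DD')  AS birth_dt    -- text to DATE
    FROM   stg_customer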