
Etl / Informatica Developer Resume


Wilton, CT

SUMMARY:

  • 7+ years of IT experience in all stages of the Software Development Life Cycle (SDLC): business/data analysis, ETL Informatica development, data modeling, data mapping, project management, build, unit testing, system integration, and user acceptance testing.
  • Experience in Information Technology with a strong background in Database development and strong ETL skills for Data warehousing using Informatica.
  • Strong SQL skills, with the ability to write and interpret complex SQL statements; skilled in SQL optimization, ETL debugging, and performance tuning.
  • Business knowledge of Dynamics CRM for electric and gas customers, and of payroll and insurance for production employees.
  • Proficient in integrating various data sources, including multiple relational databases (MS SQL Server, MySQL, Oracle 11g, DB2), XML files, and flat files, into the staging area, Dynamics CRM, the data warehouse, and data marts.
  • Experience in developing online transaction processing (OLTP), operational data store (ODS), and decision support system (DSS) databases (e.g., data warehouses).
  • Strong familiarity with master data and metadata management and associated processes.
  • Hands-on knowledge of enterprise repository tools, data modeling tools, data mapping tools, data profiling tools, and data and information system life cycle methodologies.
  • Proficient with several RDBMSs, such as MySQL, MS SQL Server, and PostgreSQL, with good knowledge of NoSQL databases such as MongoDB and Cassandra.
  • Experienced in Oracle, MySQL, MS SQL Server, and DB2 database design, PL/SQL application development, and back-end development using DDL and DML commands, SQL queries, table partitioning, collections, data import/export, stored procedures, cursors, functions, and triggers.
  • Good experience in writing SQL scripts for development, automation of ETL processes, error handling, and auditing purposes.
  • Good experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration and Configuration Management.
  • Expertise in software development methodologies (Waterfall and Agile), including client interaction, requirements gathering, analysis, and teleconferencing with clients throughout the project.
  • Used aggregation, sort, merge, join, filter, and union stages in data loading jobs.
  • Dimensional data modeling using star and snowflake schemas, fact and dimension tables, and physical and logical data models.
  • Involved in Logical and Physical Design, Backup, Restore, Data Integration and Data Transformation Service.
  • Good knowledge of Java object-oriented programming (OOP) concepts.
  • Have delivered large-scale solutions, coordinated projects with both Onshore and Offshore teams.
  • Involved in daily Scrum meetings to keep track of the ongoing project status and issues.
  • Used Git for version control and regularly pushed code to GitHub and Bitbucket.
  • Ability to work independently as well as work with teams having varying backgrounds on complex issues and have strong verbal and written communication skills.
  • Independent, enthusiastic team player with strong adaptability to new technologies.
  • Excellent communication & interpersonal skills with proven abilities in resolving complex software issues.
  • Good knowledge of the Hadoop ecosystem (HDFS, HBase, Hive, Pig, NoSQL, etc.) and data modeling in a Hadoop environment.

TECHNICAL SKILLS:

ETL: Informatica PowerCenter 10.0.1, 9.5.1, 9.0, 8.1.1; SAP Data Services 4.2

Data Profiling Tools: Informatica IDQ 10.0, 9.5.1, 8.6.1

ETL Scheduling Tools: Autosys, Tivoli, Control-M, ESP

RDBMS: DB2, Oracle 11g/12c, SQL Server 2008/2012, MySQL, PostgreSQL

Data Modeling: ER (OLTP) and dimensional (star and snowflake schemas)

Data Modeling Tools: Erwin 9.3/7.5

Scripting: UNIX shell scripting

Reporting Tools: Tableau 9, Cognos 8.x/9.x

Operating Systems: Windows XP/2000/9x/NT, UNIX

Source Management: Bitbucket, Git

Programming Languages: C, C++, PL/SQL, Python, HTML, CSS, Java

Other Tools: Notepad++, Toad, SQL Navigator, JIRA

EXPERIENCE:

Confidential - Wilton, CT

ETL / Informatica Developer

Responsibilities:

  • Analyzed the business requirements and functional specifications.
  • Loaded data into the staging tables following the data integrity rules of the interface design specifications.
  • Used external tables on the source files to load data into the staging tables (see the sketch after this list).
  • Created mapping documents to outline data flow from sources to targets.
  • Modified existing mappings for enhancements of new business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Created pre- and post-session SQL commands in sessions and mappings on the target instance.
  • Interacted with the vendor and set up the SFTP connection to the vendor’s site for transferring extracted files.
  • Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements.
  • Developed and scheduled the Jobs using Autosys Scheduler.
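A minimal sketch of the external-table-to-staging load pattern referenced in the list above, as it is commonly implemented on Oracle 11g; the directory, file, table, and column names here are hypothetical.

```sql
-- Hypothetical external table over a vendor flat file dropped via SFTP.
CREATE TABLE stg_customer_ext (
  customer_id   NUMBER,
  customer_name VARCHAR2(100),
  email_address VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY inbound_dir          -- database directory pointing at the file drop zone
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('customer_extract.dat')
)
REJECT LIMIT UNLIMITED;

-- Load the staging table, enforcing a basic integrity rule (non-null key)
-- and stamping the load date.
INSERT INTO stg_customer (customer_id, customer_name, email_address, load_date)
SELECT customer_id, customer_name, email_address, SYSDATE
FROM   stg_customer_ext
WHERE  customer_id IS NOT NULL;
```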

Environment: Informatica PowerCenter 9.0.1, Flat files, Excel files, Oracle 11g, Erwin, Autosys, UNIX

Confidential, Los Angeles, CA

ETL/ Informatica Developer

RESPONSIBILITIES:

  • Developed workflows in Workflow Manager using the Task Developer, Worklet Designer, and Workflow Designer, and monitored the results using Workflow Monitor.
  • Loaded consignment-related data into the EDW based on business rules; this data was extracted from the SAP source system.
  • Based on requirements, developed the source-to-target mapping document with business rules and the ETL specification documentation.
  • Designed and developed methodologies to migrate multiple development/production databases from Sybase to Oracle 11g.
  • Implemented / Developed incremental ETL mappings.
  • Wrote complex stored procedures using dynamic SQL to populate temporary tables from fact and dimension tables for reporting purposes (a simplified sketch follows this list).
  • Implemented a Talend POC to extract data from the Salesforce API as XML objects and .csv files and load it into a SQL Server database.
  • The technologies in use included Provia Viaware (WMS), EDI (Trusted Link), JD Edwards, iSQL, Oracle, SQL, UNIX, cron (job scheduler), and shell scripting.
  • Tuned mapping performance by following Informatica best practices and applied several techniques to reduce workflow run times.
  • Created logical and physical data models using Erwin, process flow diagrams and data mapping documents.
  • Extracted data from Teradata, big data sources, Oracle NoSQL, and Oracle Exadata databases.
  • Prepared various mappings to load the data into different stages like Landing, Staging and Target tables.
  • Wrote Hive table scripts using AWS S3 for the L1, L2, and L3 layers.
  • Excellent knowledge of HIPAA standards, EDI (electronic data interchange), EDIFACT, and HIPAA code sets, including ICD-9 and ICD-10 coding and HL7.
  • Extensively used Perl scripts to edit XML files and calculate line counts according to client needs.
  • Worked on optimizing and tuning Teradata views and SQL to improve batch performance and data response times for users.
  • Involved in data integration and migration using the Salesforce.com Apex Data Loader and the web-based import wizard.
  • Used Informatica PowerCenter 9.6.1 to extract, transform, and load data into the Netezza data warehouse from various sources such as Oracle and flat files.
  • Wrote PL/SQL scripts for pre & post session processes and to automate daily loads.
  • Two years of expertise in developing applications and batch processes using Perl, shell scripting, and JavaScript.
  • Involved in Analyzing/ building Teradata EDW using Teradata ETL utilities and Informatica.
  • Wrote shell scripts to run workflows in the UNIX environment.
  • Modified and maintained existing mappings, sessions, and workflows based on user requirements; currently working on a 4-node cluster running on AWS.
  • Installed and configured Pentaho BI Server 3.6/3.7 on Red Hat Linux and Windows Server.
  • Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
  • Developed mappings to create the EDI 834 file format using Informatica XSDs.
  • Provided application maintenance and support for Viaware (WMS), JD Edwards (World soft), and EDI applications.
  • Prepared SQL Queries to validate the data in both source and target databases.
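A simplified sketch of the dynamic-SQL reporting procedure pattern mentioned in the stored-procedure bullet above; the temporary table, fact/dimension tables, and column names are hypothetical.

```sql
-- Hypothetical global temporary table used as the reporting scratch area.
CREATE GLOBAL TEMPORARY TABLE tmp_sales_report (
  region_name  VARCHAR2(50),
  period_month DATE,
  total_amount NUMBER
) ON COMMIT PRESERVE ROWS;

-- The INSERT is built dynamically so one routine can serve any region.
CREATE OR REPLACE PROCEDURE load_sales_report (p_region IN VARCHAR2) AS
  v_sql VARCHAR2(4000);
BEGIN
  v_sql := 'INSERT INTO tmp_sales_report (region_name, period_month, total_amount) '
        || 'SELECT d.region_name, TRUNC(f.sale_date, ''MM''), SUM(f.sale_amount) '
        || 'FROM   fact_sales f JOIN dim_region d ON d.region_key = f.region_key '
        || 'WHERE  d.region_name = :region '
        || 'GROUP  BY d.region_name, TRUNC(f.sale_date, ''MM'')';
  EXECUTE IMMEDIATE v_sql USING p_region;
END load_sales_report;
/
```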

ENVIRONMENT: Informatica PowerCenter 10/9.6, EBS, Informatica BDE, Hive 2.7, HL7, Teradata 12, SSRS, Oracle 11/10g, PL/SQL, Jitterbit, Perl scripting, SSAS, Autosys, TOAD 9.x, Oracle Financials, shell scripting, Python, dynamic SQL, Oracle SQL*Loader, SSIS 2008, Sun Solaris UNIX, OBIEE, Windows XP.

Confidential, Burbank, CA

Informatica Developer

Roles & Responsibilities:

  • Developed complex mappings by efficiently using various transformations, Mapplets, Mapping Parameters/Variables, and Mapplet Parameters in the Designer. The mappings involved extensive use of the Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence Generator, and SQL transformations.
  • Presented the ETL process design to the Business Analyst and Data Warehousing Architect.
  • Assisted in building the ETL source to Target specification documents.
  • Worked on SQL overrides for the generated SQL queries in Informatica.
  • Involved in analyzing and validating data from different data sources.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes, and functions.
  • Wrote Complex SQL queries for Data manipulation, insertion, deletion and updates.
  • Experience dealing with partitioned tables and automating the process of partition drop and create in MySQL and MS SQL databases.
  • Perform data validation in the target tables using complex SQLs to make sure all the modules are integrated correctly.
  • Performed Data Conversion/Data migration using Informatica PowerCenter.
  • Prepared different Ad-Hoc reports as per the client requirement which helped in business enhancements.
  • Involved in performance tuning to improve the data migration process.
  • Documented and presented the production support documents for the developed components when handing the application over to the production support team.
  • Developed Unix scripts for processing Flat files.
  • Prepared Test Data and loaded it for Testing, Error handling and Analysis.
  • Created an issue log to identify errors and used it to prevent similar errors in future development work.
  • Developed schedules to automate the update processes and Informatica Sessions/Batches.
  • Worked on the production code fixes and data fixes.
  • Responsible for troubleshooting problems by monitoring all scheduled, running, and completed sessions, and used the Debugger for complex troubleshooting.
  • Provided knowledge transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Mappings.
  • Worked with Application support team in the deployment of the code to UAT and Production environments.
  • Involved in production support, working on mitigation tickets created while users were retrieving data from the database.
  • Worked on deploying Metadata Manager capabilities such as metadata connectors for data integration visibility, advanced search and browse of the metadata catalog, data lineage, and visibility into data objects, rules, transformations, and reference data.
  • Used materialized views to create historical snapshots of the main tables and for reporting purposes (a minimal sketch follows this list).
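A minimal sketch of the materialized-view snapshot approach mentioned in the last bullet above; the table and column names are hypothetical.

```sql
-- Hypothetical materialized view keeping a refreshable history snapshot of the
-- main orders table for reporting.
CREATE MATERIALIZED VIEW mv_order_history
BUILD IMMEDIATE
REFRESH COMPLETE ON DEMAND
AS
SELECT order_id,
       customer_id,
       order_date,
       order_status,
       total_amount
FROM   orders
WHERE  order_date < TRUNC(SYSDATE);

-- Refreshed by the nightly batch after the daily load completes.
BEGIN
  DBMS_MVIEW.REFRESH('MV_ORDER_HISTORY', method => 'C');
END;
/
```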

Confidential, Dallas, TX

IDQ ETL Developer

RESPONSIBILITIES:

  • Profiled and analyzed the data using Informatica Data Explorer (IDE) and performed a proof of concept for Informatica Data Quality (IDQ).
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Performed data profiling, cleansing, and standardization using IDQ and integrated it with the Informatica suite of tools.
  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring.
  • Extracted data from the Oracle database and spreadsheets, staged it in a single location, and applied business logic to load it into the central Oracle database.
  • Created connections in Informatica PowerCenter to retrieve data from different sources using the Java API.
  • Worked on Cognos 10 (Business Insight and Active Reports) for two demo projects to demonstrate the new features available in Cognos 10 to end users, and built portal pages for them.
  • Worked on performance tuning of programs, ETL procedures and processes.
  • Involved in Dimensional modelling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Automated the onboarding of new external data sources and trading partners.
  • Automated the processing of unstructured data formats (PDF, XLS, DOC, etc.).
  • Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring.
  • Identified and eliminated duplicates in datasets through the IDQ 8.6.1 Edit Distance, Jaro Distance, and Mixed Field Matcher components; this enables a single view of customers and helps control mailing-list costs by preventing duplicate mailings (a simplified SQL analogue is sketched after this list).
  • Involved in developing applications using SQL and wrote queries to test the data that was sent through the API.
  • Configured UNIX shell scripts for executing DataStage jobs.
  • Used Informatica Power Center for extraction, transformation and load (ETL) of data in the data warehouse.
  • Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
  • Developed complex mappings in Informatica to load the data from various sources.
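The duplicate matching itself was done with the IDQ components named above; the query below is only a rough SQL analogue of the idea, grouping on standardized fields to flag likely duplicate customer rows (the table and column names are hypothetical).

```sql
-- Flag customer rows that collapse to the same standardized key, a rough
-- SQL stand-in for the fuzzy matching performed in IDQ.
SELECT UPPER(TRIM(last_name))                      AS std_last_name,
       SOUNDEX(first_name)                         AS first_name_sound,
       REGEXP_REPLACE(postal_code, '[^0-9]', '')   AS std_postal,
       COUNT(*)                                    AS dup_count,
       MIN(customer_id)                            AS survivor_id
FROM   customer_master
GROUP  BY UPPER(TRIM(last_name)),
          SOUNDEX(first_name),
          REGEXP_REPLACE(postal_code, '[^0-9]', '')
HAVING COUNT(*) > 1;
```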

ENVIRONMENT: Informatica PowerCenter 9.0/8.x/7.1.3, Informatica Data Quality, Informatica MDM, Power Exchange 8.6.1, Data Explorer 5.0, Oracle 10g/9i, PL/SQL, SSIS/SSRS, Toad 10.5/9.5, Cognos 8.4, Power BI, Puppet, Windows XP Pro, AIX UNIX, PVCS, Tidal, Magic ticket management, SQL Server, Teradata SQL Assistant, Teradata external loaders (TPump, MLoad).

Confidential, Union, NJ

ETL Developer

Responsibilities:

  • Worked on PowerCenter client tools such as Source Analyzer and Warehouse Designer; analyzed the system and met with end users and business units to define the requirements.
  • Extracted data from Oracle and SQL Server and loaded it into the data warehouse.
  • Wrote SQL queries, triggers, PL/SQL procedures, packages, and shell scripts to apply and maintain the business rules (see the sketch after this list).
  • Translated business requirements into Informatica mappings/workflows.
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Used Informatica Designer to create complex mappings using different transformations like filter, Router, lookups, stored procedure, joiner, update strategy, expression and aggregator transformations to pipeline data to Data Warehouse/Data Marts.
  • Performance Tuning of existing mappings.
  • Developed mappings that extract data from the ODS to the data mart and monitored the daily and weekly loads.
  • Created and monitored Sessions/Batches using Informatica Server Manager/Workflow Monitor to load data into target Oracle database.
  • Worked with mapping parameters and variable functions such as SetVariable, SetCountVariable, SetMinVariable, and SetMaxVariable.
  • Migrated existing mappings to production environment.
  • Designed the reports and Universes as per the requirements.
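A minimal sketch of a business-rule trigger of the kind referenced in the list above; the target table, columns, and rule are hypothetical.

```sql
-- Hypothetical trigger enforcing simple business rules on a warehouse target table:
-- order amounts must be non-negative, and every row gets a load timestamp.
CREATE OR REPLACE TRIGGER trg_dw_orders_rules
BEFORE INSERT OR UPDATE ON dw_orders
FOR EACH ROW
BEGIN
  IF :NEW.order_amount < 0 THEN
    RAISE_APPLICATION_ERROR(-20001, 'Order amount cannot be negative');
  END IF;
  :NEW.load_ts := NVL(:NEW.load_ts, SYSDATE);
END;
/
```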

Environment: Informatica PowerCenter 8.6, Oracle 9i, PL/SQL, SQL Server, Toad, Windows NT, UNIX Shell Scripts, Business Objects.
