
ETL/Informatica Developer Resume


Durham, NC

SUMMARY

  • 8 years of experience with Informatica PowerCenter 10.4.1/9.x/8.x and Informatica Data Quality (IDQ) as an ETL developer across the full Software Development Life Cycle (SDLC): requirements gathering, analysis, application design, development, testing, implementation, system maintenance, documentation, and support of data warehousing applications.
  • 7 years of experience in requirements gathering, design, development, implementation, and testing of data warehousing and business intelligence applications.
  • Database experience with Oracle 11g/10g, DB2, MS SQL Server, Teradata, and MySQL.
  • Experience with Informatica PowerCenter 10.4.1/9.x/8.x, IDQ 9.6.1, and Teradata, SQL Server, and Oracle databases.
  • Experienced in debugging and Performance tuning of the Informatica mappings, sessions, and workflows.
  • Experience in developing ETL programs supporting data extraction, transformation, and loading using Informatica PowerCenter.
  • Hands-on experience identifying and resolving performance bottlenecks at the source, transformation, and target levels of Extract, Transform, and Load (ETL) processes, with a strong understanding of OLTP and OLAP concepts.
  • Data modeling and data analysis experience using dimensional and relational data modeling, star/snowflake schema modeling, fact and dimension tables, and physical and logical data modeling.
  • Worked in, and have good knowledge of, the Agile and Waterfall software development methodologies.
  • Experienced in SQL and Oracle PL/SQL programming, including stored procedures, packages, functions, triggers, views, materialized views, cursors, and SQL query tuning.
  • Experience writing daily batch jobs and complex UNIX shell scripts (Korn and Bash) for ETL automation in Unix/Linux environments; a sketch of this style of automation appears after this list.
  • Experience with Teradata utilities such as TPT, FastLoad, MultiLoad, TPump, and BTEQ scripts; highly experienced in Teradata SQL programming, performance tuning, and dealing with huge volumes of data.
  • Proficient in implementing complex business rules through various Informatica transformations, workflows/worklets, and mappings/mapplets.
  • Optimized and updated logical & physical data models to support new and existing projects. Maintained conceptual, logical, and physical data models along with corresponding metadata.
  • Knowledge of big data tools and Hadoop ecosystem components such as MapReduce, HDFS, Hive, Sqoop, Apache Spark, and Kafka.
  • Strong knowledge in RDBMS concepts, Data Modeling (Facts and Dimensions, Star/Snowflake Schemas), Data Migration, Data Cleansing and ETL Processes.
  • Knowledge of analyzing user requirements, procedures, and problems to automate processing or improve existing job flows and scheduling systems.
  • Performed data conversion from flat files to Oracle using Oracle SQL*Loader.
  • Designed and developed approaches to acquire data from new sources like Mainframe (DB2), and AS400 (DB2).
  • Maintained documentation for all processes implemented.
  • Experienced in the Informatica ETL developer role on data warehouse projects: enterprise data warehouses, OLAP, and dimensional data modeling.
  • Good understanding of Amazon AWS architecture: EC2, S3 buckets, and AWS CLI commands.
  • Used SQL queries to identify data issues, data fixes, manual extracts, etc.
  • Data Researching, Data Collection, Quality Assurance, Analysis and Problem - Solving Skills.
  • Worked with an offshore team to test the jobs that were developed.
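
For illustration, here is a minimal sketch of the kind of workflow-automation script referenced above, rendered in Python via subprocess for consistency with the other sketches on this page (the actual jobs used Korn/Bash shell); the integration service, domain, folder, and workflow names are hypothetical placeholders.

```python
# Minimal sketch: start an Informatica workflow with pmcmd and wait for it.
# All names (integration service, domain, folder, workflow) are hypothetical.
import subprocess
import sys

cmd = [
    "pmcmd", "startworkflow",
    "-sv", "INT_SVC_DEV",      # integration service name (hypothetical)
    "-d", "Domain_Dev",        # Informatica domain (hypothetical)
    "-u", "etl_user",
    "-p", "secret",
    "-f", "SALES_FOLDER",      # repository folder (hypothetical)
    "-wait",                   # block until the workflow completes
    "wf_daily_sales_load",
]

result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout)
if result.returncode != 0:
    sys.exit(f"Workflow failed: pmcmd returned {result.returncode}")
```

In a production batch job the same call is typically wrapped with logging and a nonzero exit code so the scheduler (AutoSys, Control-M) can detect the failure.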

TECHNICAL SKILLS

Tools: Informatica PowerCenter 10.4.1, Informatica PowerExchange, IICS, Informatica Data Quality 10.2.2, Informatica Analyst 10.2.1, Informatica BDM, Talend

Languages: XML, UML, HTML, C, C++, UNIX Shell Scripting, Python 3.7, PowerShell.

Databases: Oracle, SQL Server, IBM DB2, MS Access, Teradata, Snowflake, MySQL, PostgreSQL, Hadoop, ANSI SQL, AS400, PL/SQL, T-SQL.

Reporting & Modeling Tools: Erwin Data Modeler, Tableau, Power BI, SSRS

Other Tools: SQL*Loader, SQL*Plus, Query Analyzer, PuTTY, MS Office, MS Word, AWS S3, Control-M, AutoSys, Salesforce, AWS Lambda.

Teradata Tools & Utilities: BTEQ, MultiLoad, FastLoad, FastExport, TPump, Teradata Manager, SQL Assistant

PROFESSIONAL EXPERIENCE

Confidential, Durham, NC

ETL/Informatica Developer

Responsibilities:

  • Responsible for Business Analysis and Requirements Collection.
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Translated high-level design specifications into simple ETL coding and mapping standards.
  • Designed and customized data models for a data warehouse supporting data from multiple sources in real time.
  • Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Created mapping documents to outline data flow from sources to targets.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions, and measured facts.
  • Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
  • Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Developed mapping parameters and variables to support SQL override.
  • Created mapplets to use them in different mappings.
  • Developed mappings to load into staging tables and then into dimension and fact tables.
  • Used existing ETL standards to develop these mappings.
  • Worked on different tasks in Workflows such as sessions, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment, and Timer, and on scheduling of workflows.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables (see the Type 2 sketch after this list).
  • Extensively used SQL*Loader to load data from flat files into database tables in Oracle.
  • Modified existing mappings for enhancements of new business requirements.
  • Used Debugger to test the mappings and fixed the bugs.
  • Wrote UNIX shell scripts and pmcmd commands for FTP of files from remote servers and for backup of repositories and folders.
  • Involved in Performance tuning at source, target, mappings, sessions, and system levels.
  • Prepared migration documents to move the mappings from development to testing and then to production repositories.
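
To make the Type 2 SCD bullet above concrete, here is a minimal two-step sketch (expire the changed current row, then insert the new version). It is not the project's actual code: the customer_dim/customer_stg tables and columns are hypothetical, the real logic lived in Informatica mappings, and `conn` stands for any PEP 249 (DB-API) database connection.

```python
# Minimal Type 2 SCD sketch: expire changed rows, then insert new versions.
# Table and column names are hypothetical; `conn` is any DB-API connection.

EXPIRE_CHANGED_ROWS = """
UPDATE customer_dim d
   SET effective_end_date = CURRENT_DATE, current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM customer_stg s
                WHERE s.customer_id = d.customer_id
                  AND s.address <> d.address)
"""

INSERT_NEW_VERSIONS = """
INSERT INTO customer_dim
       (customer_id, address, effective_start_date, effective_end_date, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM customer_stg s
 WHERE NOT EXISTS (SELECT 1 FROM customer_dim d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y')
"""

def apply_scd_type2(conn):
    """Run the two-step Type 2 load inside a single transaction."""
    cur = conn.cursor()
    cur.execute(EXPIRE_CHANGED_ROWS)   # close out rows whose attributes changed
    cur.execute(INSERT_NEW_VERSIONS)   # add a fresh current row per changed/new key
    conn.commit()
```

A Type 1 mapping, by contrast, simply overwrites the attribute in place with no history row.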

Environment: Informatica PowerCenter 10.4.1, Workflow Manager, Workflow Monitor, Informatica PowerConnect/PowerExchange, Data Analyzer 8.1, PL/SQL, Oracle 10g/9i, AutoSys, SQL Server 2005, Sybase, UNIX AIX, Toad 9.0, Snowflake.

Confidential, Seattle, WA

ETL Developer/Data Engineer

Responsibilities:

  • Analyzed business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter 10.4.0.
  • Worked on Agile Methodology, participated in daily/weekly team meetings, worked with business and technical teams to understand data and profile the data.
  • Understood transformations and techniques, especially for moving on-premises sources through Attunity.
  • Expert in designing parallel jobs using stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Lookup File Set, Complex Flat File, Modify, Aggregator, and XML.
  • Applied Update Strategy, Aggregator, Expression, and Joiner transformations, then loaded the data into the data warehouse using Informatica 10.4.0.
  • Experienced in Snowflake advanced concepts like setting up resource monitors and performance tuning.
  • Involved in creating UNIX shell scripts for Informatica workflow execution.
  • Extracted data from various disparate on-premises sources including, but not limited to, SQL databases, Teradata, DB2, MS SQL Server, and flat files, and loaded it into the destination or used it directly for profiling.
  • Expertise includes taking any incoming data set and applying various data quality logic to it as per business needs.
  • Analyzed data using the Snowflake query window and designed big data quality rules.
  • Extensively worked on Informatica transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator and Stored Procedure transformations.
  • Migrated data from legacy systems (SQL Server, AS400) to Snowflake and SQL Server.
  • Extensively used Informatica BDM transformations such as Address Validator, Exception, and Parser; solid experience in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.
  • Used SQL scripts and AWS resources (Lambda, Step Function, SNS, S3) to automate data migration.
  • Expertise in deploying Snowflake features such as data sharing, events, and lakehouse patterns.
  • Worked with multiple divisions throughout the organization to conform with best practices and standards.
  • Created connections including Relational connection, Native connections, and Application connections.
  • Experience in GDW Teradata BTEQ scripts and enhancements based on new requirements.
  • Involved in Performance tuning of sessions, mappings, ETL procedures, processes for better performance and support integration testing.
  • Developed ETL pipelines into and out of the data warehouse using a combination of Python and Snowflake's SnowSQL, writing SQL queries against Snowflake (see the sketch after this list).
  • Involved in data analysis and handling the ad-hoc requests by interacting with business analysts, clients and resolve the issues as part of production support.
  • Performed debugging by checking the errors in the mapping using Debugger utility of the Designer tool and made appropriate changes to generate the required results.
  • Liaised with cross-functional teams across the enterprise, including the OLTP development team and various data warehouse teams (onsite and offshore members).
  • Implemented an error-handling strategy in ETL mappings and routed problematic records to exception tables.
  • Created UNIX shell scripts to invoke workflows and used PL/SQL to create dynamic pmcmd commands and parameter files for the workflows.
  • Responsible for writing Autosys JILs and scheduling the Informatica workflows on the Autosys server.
  • Prepared Test Plan and Test strategy from the Business Requirements and Functional Specification for the integrations.
  • Developed test cases for deployment verification, ETL data validation, and application testing.
  • Worked as an ETL tester responsible for requirements/ETL analysis, ETL testing, and designing the flow and logic for the data warehouse project.
  • Followed Waterfall and Agile development methodologies.
  • Functioned as the Onsite / Offshore coordinator for a team.
  • Experienced in writing complex SQL queries for extracting data from multiple tables.
  • Created custom views to improve performance of the PL/SQL procedures.
  • Testing has been done based on change requests and defect requests.
  • Prepared system test results after test case execution.
  • Worked on Unix-based file systems; strong in log monitoring, analysis, and providing remediation steps.
  • Worked with Informatica support to fix Informatica Linux server issues.
  • Worked on moving S3 folders and buckets using Python in Lambda; hands-on Python development (see the sketches after this list).
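
To illustrate the Python-plus-SnowSQL pipeline style mentioned above, here is a minimal sketch using the snowflake-connector-python package; the account, warehouse, stage, and table names are hypothetical placeholders, not values from the actual project.

```python
# Minimal Python -> Snowflake bulk-load sketch (stage a file, then COPY INTO).
# All names (account, warehouse, orders_raw, @orders_stage) are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",      # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="EDW",
    schema="STAGING",
)

cur = conn.cursor()
try:
    # PUT uploads (and by default gzip-compresses) the local extract file.
    cur.execute("PUT file:///data/orders.csv @orders_stage OVERWRITE = TRUE")
    # COPY INTO bulk-loads the staged file into the target table.
    cur.execute("""
        COPY INTO orders_raw
        FROM @orders_stage/orders.csv.gz
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    print(cur.fetchall())   # per-file load status returned by COPY
finally:
    cur.close()
    conn.close()
```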
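
And here is a minimal sketch of the Lambda-based S3 move, using boto3; the bucket names and prefix are hypothetical placeholders.

```python
# Minimal AWS Lambda sketch: "move" S3 objects under a prefix by copy + delete.
# boto3 ships with the Lambda Python runtime; all names here are hypothetical.
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    src_bucket = "legacy-landing-zone"   # hypothetical source bucket
    dst_bucket = "edw-ingest"            # hypothetical target bucket
    prefix = "daily-extracts/"

    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=src_bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            # S3 has no native "move": copy to the target, then delete the source.
            s3.copy_object(
                Bucket=dst_bucket,
                Key=key,
                CopySource={"Bucket": src_bucket, "Key": key},
            )
            s3.delete_object(Bucket=src_bucket, Key=key)
```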

Environment: Informatica BDM 10.4.0, ETL, Attunity, SQL Server, DB2, Oracle, Salesforce, AS400, AWS S3, Teradata, Snowflake, Aurora, Hadoop 2.9, Informatica Administrator Console, Informatica Analyst, PostgreSQL, Hive, Linux, Shell, Python 3.6, Informatica Cloud.

Confidential, Long Island

ETL Developer

Responsibilities:

  • Involved in gathering requirements from business customers and developed the requirements document and ETL specification.
  • Deployed Talend jobs to the server and scheduled them through the Talend Job Conductor tool; worked independently to develop ETL Talend jobs.
  • Developed complex ETL Talend jobs using components such as tMap, tFilterRow, and tJavaRow.
  • Created complex stored procedures, triggers, functions, indexes, tables, views, and other T-SQL code and SQL joins for applications, following SQL coding standards (see the T-SQL sketch after this list).
  • Extensively involved in performance tuning of the Informatica ETL mappings by using the caches and overriding the SQL queries and by using Parameter files.
  • Used Talend to extract data from various disparate sources including, but not limited to, Oracle, Netezza, MySQL, MongoDB, Hadoop, MS SQL Server, and flat files, and loaded it into the destination.
  • Experience with big data platforms (i.e., Hadoop) and big data tools such as Apache Spark.
  • Proficiency in SQL, big data technologies and working with large data sets.
  • Ability to translate business attributes (GUI Labels, Reporting Attributes) into Data Model elements
  • Used reverse engineering techniques to find the source tables and fields for modifications.
  • Developed Confidential proprietary analytical tools and reports with Excel, Power Pivot, and PowerPoint.
  • Provide support and maintenance on ETL processes and documentation.
  • Responsible for all phases of solution delivery from initial requirements analysis, solution estimation, coordination of development, testing and delivery to production.
  • Hands-on experience in design, development, and CI/CD releases.
  • Experienced in developing transformations with different file formats such as positional, JSON, XML, CSV, and flat files.
  • Experienced in working with database components in Talend, such as Oracle and MySQL.
  • Performed index analysis for tables and devised more efficient solutions using clustered and non-clustered indexes, achieving significant performance boosts with the Index Tuning Wizard.
  • Filtered bad data using Derived Column, Lookup, Fuzzy Lookup, and Conditional Split.
  • Configured and maintained Report Manager and Report Server for SSRS; deployed and scheduled reports in Report Manager.
  • Created reports using charts, gauges, tables, and matrices; created parameterized reports, dashboard reports, linked reports, and subreports by year, quarter, month, and week.
  • Created drill-down and drill-through reports by region.
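
As a small illustration of the T-SQL work above, here is a minimal sketch with hypothetical procedure and table names, driven through pyodbc so it stays consistent with the other Python sketches on this page; the connection details are placeholders.

```python
# Minimal T-SQL sketch via pyodbc; procedure/table names are hypothetical.
import pyodbc

CREATE_PROC = """
CREATE PROCEDURE dbo.usp_get_orders_by_region
    @Region NVARCHAR(50)
AS
BEGIN
    SET NOCOUNT ON;
    SELECT o.order_id, o.order_date, o.amount
      FROM dbo.orders AS o
      JOIN dbo.customers AS c ON c.customer_id = o.customer_id
     WHERE c.region = @Region;
END
"""

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlhost;DATABASE=edw;Trusted_Connection=yes;"  # hypothetical details
)
cur = conn.cursor()
cur.execute(CREATE_PROC)
conn.commit()

# Call the procedure; pyodbc uses ? placeholders for parameters.
for row in cur.execute("EXEC dbo.usp_get_orders_by_region @Region = ?", "Northeast"):
    print(row.order_id, row.amount)
```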

Environment: ETL, SSRS, SSAS, Informatica PowerCenter 10.1/9.6.1/9.5.1, Oracle E-Business Suite (EBS), Talend 6.2, Netezza 3.0, Hadoop workbench, PuTTY 0.64, Teradata 14.0, Kafka, SQL Server 2014, UNIX, Toad, PL/SQL, DB2, Tableau 10.2.

Confidential, Milwaukee, WI

ETL Developer/Teradata Developer

Responsibilities:

  • Worked in an Agile development environment and interacted with users and business analysts to collect and understand business requirements.
  • Worked on building the ETL architecture and Source to Target mapping to load data into Data warehouse.
  • Involved in the installation and configuration of Informatica Power Center 10.1 and evaluated Partition concepts in Power Center 10.1
  • Analyzed the method of transforming existing data into a format for the new environment and loading this data into other database structures.
  • Provided necessary change and support documentation.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Created stored procedures, views, user defined functions and common table expressions in SQL and Hadoop.
  • Tested to verify that all data were synchronized after troubleshooting, and used SQL to verify/validate test cases.
  • Reviewed Informatica mappings and test cases before delivering to Client.
  • Worked as an ETL tester responsible for requirements/ETL analysis, ETL testing, and designing the flow and logic for the data warehouse project.
  • Experienced in writing complex SQL queries for extracting data from multiple tables.
  • Implemented very large-scale data intelligence solutions around Snowflake Data Warehouse.
  • Solid experience in and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on Snowflake Cloud Data Warehouse.
  • Reviewed ETL performance and conducted performance tuning as required on mappings/workflows or SQL.
  • Created UNIX scripts for parsing and modifying data; experienced in using the AutoSys job scheduler for automation of UNIX shell scripts and batch scheduling.
  • Built SSIS packages involving ETL process, extracting data from various flat files, Excel files, legacy systems and loading into SQL server.
  • Manage cross-program data assurance for physical data items in source and target systems.
  • Strong SQL knowledge and working experience in Teradata Stored Procedures/BTEQ scripts.
  • Experience in creating fact tables; knowledge of reporting tools, especially Tableau.
  • Extracted data from different sources such as MVS data sets, flat files ("pipe"-delimited or fixed-length), Excel spreadsheets, and databases.
  • Used Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL (see the BTEQ sketch after this list).
  • Managed all development and support efforts for the Data Integration/Data Warehouse team.
  • Used Informatica PowerCenter 9.6.1 to extract, transform, and load data into the Netezza data warehouse from various sources such as Oracle and flat files.
  • Developed and deployed ETL job workflow with reliable error/exception handling and rollback within the MuleSoft framework.
  • Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Provided on-call support during releases of the product from lower-level to production environments.
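
To illustrate the BTEQ scripting mentioned above, here is a minimal sketch that generates a BTEQ script and runs it through subprocess, in keeping with the other Python sketches; the logon string and table names are hypothetical placeholders.

```python
# Minimal sketch: run a Teradata BTEQ script from Python via subprocess.
# The tdprod logon and the stg/edw table names are hypothetical.
import subprocess

BTEQ_SCRIPT = """
.LOGON tdprod/etl_user,secret;
.SET ERROROUT STDOUT;

INSERT INTO edw.sales_fact
SELECT s.sale_id, s.store_id, s.sale_date, s.amount
FROM   stg.sales_stg s;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
"""

result = subprocess.run(
    ["bteq"],              # BTEQ reads its commands from stdin
    input=BTEQ_SCRIPT,
    capture_output=True,
    text=True,
)
print(result.stdout)
if result.returncode != 0:
    raise RuntimeError(f"BTEQ exited with return code {result.returncode}")
```

The `.IF ERRORCODE <> 0 THEN .QUIT 8` line is what lets a scheduler such as AutoSys treat a failed SQL step as a failed job.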

Environment: Informatica Developer, C++, Oracle 12c, AWS, Teradata 14.0, SQL Server 2014, AutoSys Scheduler Tool, UNIX, Toad, PL/SQL, SSIS, SSRS, T-SQL, PowerConnect, DB2, Tableau 10.1, PowerShell.

Confidential, Lakeland, FL

ETL Developer/SQL Developer

Responsibilities:

  • Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
  • Designed ETL specification documents for all the projects.
  • Created Tables, Keys (Unique and Primary) and Indexes in the SQL server.
  • Extracted data from flat files, DB2, SQL Server, and Oracle to build an operational data store; applied business logic to load the data into the global data warehouse.
  • Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
  • Maintained source and target mappings, transformation logic and processes to reflect the changing business environment over time.
  • Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
  • Worked on complex Source Qualifier queries and pre- and post-SQL queries in the target.
  • Worked on different tasks in Workflow Manager like Sessions, Events raise, Event wait, Decision, E-mail, Command, Worklets, Assignment, Timer and Scheduling of the workflow.
  • Extensively used workflow variables, mapping parameters and mapping variables.
  • Created sessions and batches for incremental loads into staging tables and scheduled them to run daily (see the incremental-extract sketch after this list).
  • Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.
  • Performed QA testing and supported the QA team and UAT users.
  • Created detailed Unit Test Document with all possible Test cases/Scripts.
  • Conducted code reviews of code developed by my teammates before it moved into QA.
  • Worked on DataStage clustering architecture and parallel jobs.
  • Provided support in developing the entire warehouse architecture and planning the ETL process; prepared documentation and test cases to smooth project handover and maintain the SDLC.
  • Identified bottlenecks and improved overall session performance; experienced in scheduling Informatica sessions in AutoSys for load automation.
  • Provided production support by monitoring the processes running daily.
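
To make the incremental-load bullet above concrete, here is a minimal watermark-based sketch in the spirit of the mapping-parameter approach; the src_orders and etl_control tables are hypothetical, and the named `:bind` placeholders assume an Oracle-style driver (adjust to your driver's paramstyle).

```python
# Minimal incremental-extract sketch using a last-run watermark.
# Table/column names are hypothetical; `conn` is a DB-API connection
# whose driver supports Oracle-style named binds.
from datetime import datetime

def incremental_extract(conn, last_run_ts: datetime):
    """Pull only the rows changed since the previous successful run."""
    cur = conn.cursor()
    cur.execute(
        """
        SELECT order_id, customer_id, amount, updated_at
          FROM src_orders
         WHERE updated_at > :last_run
        """,
        {"last_run": last_run_ts},
    )
    return cur.fetchall()

def save_watermark(conn, run_ts: datetime):
    """Advance the high-water mark only after a successful load."""
    cur = conn.cursor()
    cur.execute(
        "UPDATE etl_control SET last_run_ts = :ts WHERE job_name = 'orders_load'",
        {"ts": run_ts},
    )
    conn.commit()
```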

Environment: Informatica Power Center 10.x/9.x, Erwin, Oracle 11g/10g, PL/SQL, SQL Loader, MS SQL Server 2012/2008, Autosys, SQL Server 2014, SSIS, T-SQL

Confidential

Informatica Administrator/Developer

Responsibilities:

  • Installed and configured Informatica Power Center 9.1.
  • Configured Load balancing and Grid in Informatica.
  • Creation and maintenance of Informatica users and privileges.
  • Created Native Groups, Native Users, roles, and privileges and assigned them to each user group.
  • Developed a standard ETL framework to enable the reusability of similar logic across the board. Involved in System Documentation of Dataflow and methodology.
  • Installed and configured B2B DX and MFT and troubleshot related issues.
  • Developed mappings to extract data from SQL Server, Oracle, Flat files, and load into DataMart using the Power Center.
  • Wrote pre-session and post-session scripts in mappings.
  • Created deployment groups and assigned the permissions to the deployment group.
  • Created Informatica folders and assigned permissions to them.
  • Created OS profiles for the users that run the applications.
  • Migrated Informatica mappings/sessions/workflows from Dev to Test, Test to Stage, and Stage to Prod environments (see the pmrep sketch after this list).
  • Coordinated with UNIX and Database Administrators for creating OS profiles and file systems.
  • Handled outages during UNIX and database maintenance windows.
  • Collaboratively worked with Application Support Team, Network, Database Teams, and UNIX teams.
  • Performed bounce activities (restarting services) during network changes or other maintenance and communicated them to the business.
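
To illustrate the migration path, here is a minimal sketch that drives the pmrep command-line client from Python; the repository and domain names, workflow, folder, and control file are all hypothetical, and the exact pmrep options should be verified against the installed PowerCenter version.

```python
# Minimal sketch of a Dev -> Test object migration using the pmrep CLI.
# All names and paths are hypothetical; verify pmrep options for your version.
import subprocess

def run(cmd):
    """Run one pmrep command, raising if it exits nonzero."""
    subprocess.run(cmd, check=True)

# Connect to the source (Dev) repository.
run(["pmrep", "connect", "-r", "DEV_REP", "-d", "Domain_Dev",
     "-n", "admin_user", "-x", "secret"])

# Export a workflow to XML (dependency options vary by version).
run(["pmrep", "objectexport", "-n", "wf_daily_load", "-o", "workflow",
     "-f", "SALES_FOLDER", "-u", "/tmp/wf_daily_load.xml"])

# Connect to the target (Test) repository and import with a control file
# that maps folders and sets the conflict-resolution rules.
run(["pmrep", "connect", "-r", "TEST_REP", "-d", "Domain_Test",
     "-n", "admin_user", "-x", "secret"])
run(["pmrep", "objectimport", "-i", "/tmp/wf_daily_load.xml",
     "-c", "/tmp/import_control.xml"])
```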

Environment: Informatica PowerCenter 9.1 HF4, Oracle 11.2.0.3.0, Linux OS.
