
Sr. ETL Developer Resume


Gardner, KS

SUMMARY

  • Around 8 years of total IT experience in software analysis, design, and development for various software applications in client-server environments, with expertise in providing ETL and Business Intelligence solutions in Data Warehousing for Decision Support Systems.
  • Excellent experience in ETL processing using Informatica Power Center 9.x/8.x/7.x, Power Connect, Power Exchange, Informatica DVO, and Informatica TDM.
  • Experience in Informatica B2B Data Exchange using unstructured and structured data sets.
  • Good experience working with databases such as Oracle, Teradata, Netezza, and SQL Server.
  • Expertise in using Informatica client tools - Designer, Repository Manager, Repository Server Administration Console, Workflow Manager, and Workflow Monitor
  • Expertise in testing, debugging, validation and tuning of mappings, sessions and workflows in Informatica
  • In-depth understanding of Data Warehousing concepts such as the Bill Inmon and Kimball methodologies.
  • Excellent experience working on Netezza, utilizing nzload and nzsql scripts for loading various sets of data with an ELT approach (see the sketch at the end of this summary)
  • Experience in Big Data analytics using Cassandra, MongoDB, MapReduce, and relational databases.
  • Hands on experience in installing, configuring, and using Hadoop ecosystem components like Hadoop MapReduce, HDFS, HBase, Zookeeper, Oozie, Hive, Sqoop, Pig, and Flume.
  • Experience in analyzing data using HiveQL, Pig Latin, HBase and custom MapReduce programs in Java. Extending Hive and Pig core functionality by writing custom UDFs.
  • Excellent experience with Big Data integration using the Informatica and Talend ETL tools.
  • Extensive experience in Data Warehouses with a strong understanding of Logical, Physical, and Dimensional Data Modeling to design Star and Snowflake schemas.
  • Excellent experience using Teradata utilities such as FastLoad, MultiLoad, FastExport, TPump, and TPT as per business needs
  • Experienced in Capturing Incremental data (CDC) from source systems.
  • Proficient in using Workflow Manager Tools like Task Developer, Workflow Designer and Worklet Designer
  • Hands-on experience in tuning mappings, identifying and resolving performance bottlenecks at various levels like sources, targets, mappings, and sessions.
  • Involved in Data Warehouse QA Process. Good experience in Analyzing and Validating Large Volumes of Data from different systems.
  • Experienced in developing applications in Oracle and writing Stored Procedures, Triggers.
  • Good at writing shell scripts in UNIX to automate the process of loading and pulling data from different servers.
  • Experience in Unix shell scripting, scheduling cron jobs, and job scheduling on multiple platforms like Windows, Unix, and Linux
  • Excellent Team player and can work on both development and maintenance phases of the project
  • Excellent analytical, programming, written, and verbal communication skills with the ability to interact with individuals at all levels
  • Quick learner with the ability to meet tight deadlines and work under pressure
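
The Netezza ELT pattern mentioned above can be illustrated with a short shell sketch: nzload bulk-loads a delimited file into a staging table, and nzsql then pushes the transformation down to the database. The host, credentials, file, and table names below are placeholders, not details from any engagement.

    #!/bin/ksh
    # Sketch: load a pipe-delimited extract into a Netezza staging table,
    # then transform it inside the database (ELT). All names are placeholders.
    NZ_HOST=nzhost; NZ_DB=edw; NZ_USER=etl_user; NZ_PW=etl_pw
    DATAFILE=/data/incoming/orders.dat

    nzload -host "$NZ_HOST" -db "$NZ_DB" -u "$NZ_USER" -pw "$NZ_PW" \
           -t STG_ORDERS -df "$DATAFILE" -delim '|'
    [ $? -ne 0 ] && { echo "nzload failed" >&2; exit 1; }

    # ELT: run the set-based transformation in Netezza instead of pulling rows out
    nzsql -host "$NZ_HOST" -d "$NZ_DB" -u "$NZ_USER" -pw "$NZ_PW" -c \
        "INSERT INTO ORDERS_FACT (ORDER_DT, TOTAL_AMT)
         SELECT ORDER_DT, SUM(AMOUNT) FROM STG_ORDERS GROUP BY ORDER_DT;"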

TECHNICAL SKILLS

Programming Languages: C, C++, JAVA, SQL, PL/SQL, Shell Scripting.

ETL Tools: Informatica Power Center 9.x/8.x/7.x/6.x, Power Exchange, Power Connect, Informatica TDM, DVO, Talend 5.1/4.x

RDBMS: Oracle 11g/10g/9i/8i/8.0 (SQL, PL/SQL, Stored Procedures, Functions), SQL Server 2008 R2/2005/2000, Teradata, Netezza

Data Modeling: Erwin, Power Designer

BI (Reporting) Tools: Business Objects XI R4/R3/R2

Database Tools: Toad, SQL Loader

Operating Systems: Windows 2000/NT/XP, Unix, Red Hat Linux, Mainframes

Scripting: Unix Shell Scripting

Other: HTML, XML, Autosys, Live Office

PROFESSIONAL EXPERIENCE

Confidential, Gardner, KS

Sr. ETL Developer

Responsibilities:

  • Coordinated with Business Users to understand business needs and implement the same into a functional Data warehouse design.
  • Converted functional specifications into technical specifications.
  • Developed complex jobs to load data from multiple source systems like Oracle 10g, flat files, and XML files into a data mart in an Oracle database.
  • Responsible for the development, support, and maintenance of ETL (Extract, Transform and Load) processes using Talend Integration Suite.
  • Wrote Unix shell scripts for automation of the ETL process (see the sketch after this section's environment listing).
  • Created the Talend Development Standards document, which describes the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
  • Responsible for creating fact, lookup, dimension, and staging tables and other database objects like views, stored procedures, functions, indexes, and constraints
  • Implemented error handling in Talend to validate data integrity and data completeness for the data from the flat files.
  • Worked on Slowly Changing Dimensions Type 1 and Type 2 for populating dimension tables as per the specified mapping rules.
  • Worked on Google Analytics, populating data using the tGoogleAnalytics component to implement dashboards for business needs.
  • Extracted data from an Informix database using Talend and loaded it into SQL Server database tables.
  • Worked in Talend with Business Intelligence components like tJasperOutput and tJasperOutputExec, passing data to build reports through Jaspersoft.
  • Used Jaspersoft iReport Designer for generating weekly order summary reports.
  • Scheduled the reports using Jasper Server.
  • Participated in weekly end-user meetings to discuss data quality, performance issues, ways to improve data accuracy, and new requirements.
  • Involved in migrating objects from DEV to QA and testing them and then promoting to Production.
  • Provided production Support by running the jobs and fixing the bugs.
  • Monitored and troubleshot batches and jobs for weekly and monthly extracts from various data sources across all platforms to the target database.
  • Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.

Environment: Talend Open Studio 5.5, Talend Enterprise Edition Integration Suite 5.3, Oracle, Informix, SQL Server, XML files, CSV files, Tivoli, Windows XP (client), Linux, Toad, Oracle SQL Developer, SSH (Secure Shell), Jaspersoft iReport Designer, Talend Administration Center (TAC)
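
A minimal sketch of the kind of shell automation referenced in the bullets above: the script watches an inbox directory, launches a built Talend job through its generated launcher script, and archives the file on success. The directory layout, job name, and context parameter are assumptions for illustration only.

    #!/bin/bash
    # Sketch: automate a Talend file load; all paths and names are placeholders.
    INBOX=/data/inbox
    ARCHIVE=/data/archive
    JOB_DIR=/opt/talend/jobs/LoadOrders
    FILE="$INBOX/orders.csv"

    # Nothing to do if the expected file has not arrived yet
    [ -f "$FILE" ] || { echo "$(date '+%F %T') no file to process"; exit 0; }

    # Built Talend jobs ship with a <JobName>_run.sh launcher; context
    # parameters are passed with --context_param (job/param names assumed)
    "$JOB_DIR/LoadOrders_run.sh" --context_param inputFile="$FILE"
    rc=$?

    if [ $rc -eq 0 ]; then
        # Archive the processed file with a timestamp on success
        mv "$FILE" "$ARCHIVE/orders_$(date +%Y%m%d%H%M%S).csv"
    else
        echo "Talend job failed with exit code $rc" >&2
        exit $rc
    fi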

Confidential, Lincoln, AL

Sr. Informatica/Talend Developer

Responsibilities:

  • Designed and developed Informatica mappings to build business rules to load data.
  • Extensively worked on Informatica Lookup, Aggregator, Expression, Stored Procedure and Update Transformations to implement complex rules and business logic.
  • Developed complex Talend ETL jobs to migrate the data from flat files to database.
  • Implemented custom error handling in Talend jobs and worked with different methods of logging.
  • Designed and coded ETL/Talend jobs to process data into target databases.
  • Created Talend jobs to load data into various Oracle tables; utilized Oracle stored procedures and wrote Java code to capture globalMap variables and use them in the job.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Implemented Slowly Changing Dimensions to preserve history for data analysis (see the Type 2 sketch after this section's environment listing).
  • Created Implicit, local and global Context variables in the job.
  • Responsible for creating fact, lookup, dimension, and staging tables and other database objects like views, stored procedures, functions, indexes, and constraints
  • Followed the organization defined Naming conventions for naming the Flat file structure, Talend Jobs and daily batches for executing the Talend Jobs.
  • Wrote complex SQL queries to extract data from various sources and integrated them with Talend.
  • Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
  • Worked on Context variables and defined contexts for database connections, file paths for easily migrating to different environments in a project.
  • Implemented error handling in Talend to validate data integrity and data completeness for the data from the flat files
  • Created Unix scripts and ran them using tSSH and tSystem for reading data from flat files and archiving the flat files on the specified server.
  • Used Informatica Power Center for extraction, transformation, and loading (ETL) of data from heterogeneous source systems.
  • Involved in importing Source/Target Tables from the respective databases and created Reusable Transformations (Joiner, Routers, Lookups, Rank, Filter, Expression, and Aggregator).
  • Created Mapplets and Mappings using the Designer module of Informatica.
  • Analyzed and created Facts, Dimension tables.
  • Created Stored Procedures for data transformation purpose.
  • Generated PL/SQL and Shell scripts for scheduling periodic load processes.
  • Extensively worked on the Database Triggers, Functions and Database Constraints.
  • Worked on the External Loader to populate the target Oracle tables.
  • Performed unit and regression testing for the application

Environment: Informatica Power Center 9.x/8.x, Power Exchange, Talend, Erwin, Oracle 11g, Teradata, DB2, UNIX Shell Scripting, SQL, PL/SQL, Cognos, Toad, UNIX, Windows
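
For the Slowly Changing Dimension work noted above, a common Type 2 pattern is to expire the current dimension row and insert a new version. The sketch below runs that two-step pattern through a sqlplus heredoc; the connection string, tables, and columns are hypothetical.

    #!/bin/bash
    # Sketch: SCD Type 2 expire-and-insert; table/column names are placeholders.
    sqlplus -s etl_user/etl_pw@ORCL <<'EOF'
    WHENEVER SQLERROR EXIT SQL.SQLCODE
    -- Step 1: expire the current row for customers whose attributes changed
    UPDATE dim_customer d
       SET d.eff_end_date = SYSDATE, d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.cust_name <> d.cust_name OR s.address <> d.address));
    -- Step 2: insert a new current version for new and changed customers
    INSERT INTO dim_customer
        (customer_key, customer_id, cust_name, address,
         eff_start_date, eff_end_date, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.cust_name, s.address,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1 FROM dim_customer d2
                        WHERE d2.customer_id = s.customer_id
                          AND d2.current_flag = 'Y');
    COMMIT;
    EOF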

Confidential, Colorado Springs, CO

Sr. Informatica Developer

Responsibilities:

  • Worked with Business Analysts (BA) to analyze the Data Quality issue and find the root cause for the problem with the proper solution to fix the issue.
  • Analysis of the downstream flow on any changes to the Enterprise Data warehouse.
  • Documented the process for resolving data quality issues, covering analysis, design, construction, and testing
  • Made data model changes and other changes to the transformation logic in existing mappings according to the business requirements for incremental fixes
  • Designed ETL processes and developed source to target data mappings
  • Developed, tested, and deployed ETL routines using ETL tools and external programming/scripting languages, as necessary
  • Scheduled jobs using Control-M.
  • Worked extensively with Informatica tools like Source Analyzer, Warehouse Designer, Transformation Developer and Mapping Designer.
  • Extracted data from various heterogeneous sources like DB2, Mainframes and Flat Files using Informatica Power center and loaded data in target database.
  • Extensively worked with different transformations such as Expression, Aggregator, Sorter, Joiner, Router, Filter, and Union in developing the mappings to migrate the data from source to target.
  • Used connected and unconnected Lookup transformations and lookup caches to look up data from relational tables and flat files.
  • Used the Update Strategy transformation extensively with DD_INSERT, DD_UPDATE, DD_DELETE, and DD_REJECT.
  • Worked with External stored procedures for data cleansing purpose
  • Extensively Implemented SCD TYPE 2 Mappings for CDC (Change data capture) in EDW.
  • Code walkthrough and Review of documents which are prepared by other team members.
  • Involved in doing Unit Testing, Integration Testing and Data Validation.
  • Extensively involved in migrating Informatica mappings from Dev to SIT, UAT, and Production environments.
  • Worked on developing the downstream data marts like CDM, IDM, and BDR, which are used for reporting; data is extracted from the EDW and loaded into these data marts according to the business requirements.
  • Created and ran debug sessions in the Debugger against pre-existing sessions to monitor and test them prior to their normal run in the Workflow Manager
  • Extensively worked in migrating the mappings, worklets and workflows within the repository from one folder to another folder as well as among the different repositories.
  • Created mapping parameters and variables and wrote parameter files.
  • Implemented various performance tuning techniques by finding the bottlenecks at the source, target, mapping, and session levels and optimizing them.
  • Used UNIX shell scripting to invoke sessions in the Workflow Manager (see the pmcmd sketch after this section's environment listing).
  • Used SQL queries and database programming using PL/SQL (writing packages, stored procedures/functions, and database triggers). Used SQL tools like TOAD to run SQL queries and validate the data in the warehouse.
  • Worked with the SCM code management tool to move the code to Production
  • Extensively worked with session logs and workflow logs for error handling and troubleshooting.

Environment: Sun Solaris, Windows, Informatica Power Center 8.6, Power Connect, Informatica B2B Data Exchange, Oracle 10g, DB2 UDB, PDF files, spreadsheets, Word documents, legacy formats, XML, Business Objects XI R3, Autosys, Unix Shell Scripts, Erwin
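
Invoking Informatica sessions from shell, as mentioned above, is typically done with the pmcmd command-line utility. A minimal sketch follows; the Integration Service, domain, folder, and workflow names are placeholders:

    #!/bin/ksh
    # Sketch: start an Informatica workflow from shell via pmcmd and
    # propagate its exit status. Service/folder/workflow names are placeholders.
    pmcmd startworkflow \
        -sv INT_SVC_DEV -d DOMAIN_DEV \
        -u "$INFA_USER" -p "$INFA_PASS" \
        -f EDW_LOADS -wait wf_load_customer_dim
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "$(date '+%F %T') wf_load_customer_dim failed, pmcmd rc=$rc" >&2
        exit $rc
    fi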

Confidential, Lincolnshire, IL

Informatica/Teradata Developer

Responsibilities:

  • Developed various Mappings and Transformations using Informatica Designer
  • Involved in design, development and maintenance of database for Data warehouse project
  • Designed and customized data models for a Data Mart supporting data from multiple sources in real time.
  • Used different transformations for Extraction/Transformation, data cleansing and loading data into staging areas and Presentation Tables.
  • Worked with the Workflow Manager and Workflow Monitor to schedule batches, run workflows, and monitor session logs.
  • Effectively tuned the performance of Informatica Transformations, mappings, sessions and workflows.
  • Worked on Data Extraction, Data Transformations, Data Loading, Data Conversions and Data Analysis.
  • Extensively designed data mappings using Filter, Expression, and Update Strategy transformations in Power Center Designer.
  • Created target load order group mappings and scheduled them for Daily Loads.
  • Extensively used UNIX commands within Informatica for pre-session and post-session data loading processes.
  • Made extensive use of Informatica Metadata Manager for data lineage and where-used analysis, metadata browsing, metadata reporting, and metadata documentation.
  • Used Mapping designer and Mapplets to generate different mappings for different loads
  • Used the PL/SQL procedures for data extraction, transformation and loading
  • Performance Tuning of sources, targets, mappings and SQL queries in transformations
  • Used SQL*Loader for loading data into table structures (see the sketch after this section's environment listing)
  • Created various transformations like Joiner, Aggregator, Expression, Filter, and Update Strategy
  • Extensive system study, design, development and testing were carried out in the Oracle environment to meet the customer requirements.
  • Performed various SQL queries for business analysts.
  • Analyzed data and built data warehouses using SQL, PL/SQL, and SAS.

Environment: Informatica, Oracle 9i, Teradata, SQL Assistant, MLOAD, FASTLOAD, TPUMP, PL/SQL, SQL*Loader, Windows, UNIX
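
A hedged sketch of the SQL*Loader usage noted above: a control file describes a comma-delimited extract and sqlldr performs the bulk load. The table, columns, and file paths are placeholders.

    #!/bin/bash
    # Sketch: bulk-load a CSV into a staging table with SQL*Loader.
    # The control file is written inline; all names are placeholders.
    cat > /tmp/stg_orders.ctl <<'EOF'
    LOAD DATA
    INFILE '/data/incoming/orders.csv'
    APPEND INTO TABLE stg_orders
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (order_id, customer_id, order_date DATE 'YYYY-MM-DD', amount)
    EOF

    sqlldr userid=etl_user/etl_pw@ORCL \
           control=/tmp/stg_orders.ctl \
           log=/tmp/stg_orders.log bad=/tmp/stg_orders.bad
    [ $? -ne 0 ] && { echo "sqlldr reported errors" >&2; exit 1; }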

Confidential, Middletown, NJ

Informatica/Teradata Developer

Responsibilities:

  • Involved in identifying dimensions and facts to create enterprise data warehouse and model.
  • Involved in Designing a Star-Schema based warehouse after understanding the business logic.
  • Developed various medium and complex mappings for Loading Data in to Target tables Using Informatica.
  • Implemented complex business logic and error handling logic
  • Designed and developed Informatica Mappings, Reusable Sessions, Worklets, Workflows, Dynamic Parameter files
  • Designed and implemented Audit control and Exception control strategies
  • Provided solutions for various performance bottlenecks in Informatica mappings
  • Designed and developed Autosys JIL scripts to schedule Informatica workflows
  • Developed various shell scripts using Korn Shell to Integrate various components
  • Created test cases and pseudo test data to verify accuracy and completeness of ETL process
  • Involved in Unit testing, System Integration testing, User Acceptance Testing
  • Extracted, transformed, and loaded source system data from the distributed environment into the data warehouse database using Informatica.
  • Extracted data from flat files and various relational databases into the Teradata and Oracle data warehouse databases.
  • Created mappings using various Transformations like Aggregator, Expression, Filter, Router, Joiner, Lookup, Update strategy, Source Qualifier, Sequence generator, Stored Procedure and Normalizer.
  • Developed various Mappings & Mapplets to load data from various sources using different Transformations.
  • Used MLOAD, TPUMP, FASTLOAD to load data to Teradata across various areas such as staging, EDW and data marts.
  • Utilized pushdown optimization in Informatica
  • Wrote many BTEQ scripts on Teradata and integrated them with Unix scripting (see the sketch after this section's environment listing).
  • Implemented performance tuning in mappings by identifying the bottlenecks and implementing effective transformation logic.
  • Performed various update strategies using Lookup and Update Strategy transformations.
  • Created stored procedures to populate sample data and carried out test loads.
  • Loaded data from flat files and XML files to temporary tables in Oracle database using SQL*Loader.

Environment: Oracle 9i, Informatica Power Center, UNIX/AIX, Windows 2000, SQL*Loader, Teradata, SQL Assistant, MLOAD, FASTLOAD, TPUMP, Erwin 4.1, Cognos 7.0, Toad, SQL, PL/SQL
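
A minimal sketch of integrating BTEQ with Unix scripting as described above: the SQL runs inside a heredoc and BTEQ's error code drives the script's exit status. The Teradata host, credentials, and tables are placeholders.

    #!/bin/ksh
    # Sketch: run a Teradata BTEQ step from shell; all names are placeholders.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pw;
    -- Move cleansed staging rows into the EDW target
    INSERT INTO edw.customer_dim
    SELECT * FROM stage.customer_stg;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF
    rc=$?
    [ $rc -ne 0 ] && { echo "BTEQ step failed, rc=$rc" >&2; exit $rc; }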
