ETL Developer Resume
Dallas, TX
PROFESSIONAL SUMMARY:
- Around 8 years of hands-on experience in ETL, ETL testing, BI, data architecture, and relational database management systems in data warehouse environments.
- Expertise in programming languages such as Java and Python, and in data platforms such as Hadoop and MongoDB.
- Excellent analytical skills and a proven ability to read and interpret diverse data points.
- Experience in designing and developing web applications and big data applications using Java, JEE, Oracle, and Cloudera Hadoop-based big data technologies.
- SQL Server and Oracle PL/SQL package writing, query optimization, normalization, de-normalization, materialized views, performance tuning, data densification, data profiling, analytic functions, external tables, etc.
- Extensively created mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tJava, tJavaRow, tDie, tAggregateRow, tWarn, tLogCatcher, tMysqlSCD, tFilter, tGlobalMap, etc.
- Experience in Talend Open Studio and Talend Integration Suite.
- Expert in writing SQL queries and optimizing the queries in Oracle, SQL Server 2008, Netezza, and Teradata.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.
- Extensive development and design of ETL methodology for supporting data transformations and processing in a corporate wide ETL Solution using Informatica Power Center 9.1/8.6.1, Oracle, PL/SQL, DB2, SQL Server 2008.
- Experienced in using Talend database components, File components and processing components as per the design requirements.
- Experienced in Talend Service Oriented Web Services using SOAP, REST and XML/HTTP technologies using Talend ESB components.
- Strong listening skills, allowing me to carefully consider instructions and feedback from other staff members.
- Extensive experience in using Microsoft BI Studio products like SSIS, SSAS, and SSRS for implementation of ETL methodology in data extraction, transformation, and loading.
- Expert in Data Warehouse development starting from inception to implementation and ongoing support, strong understanding of BI application design and development principles.
- Experience in providing Logging, Error handling by using Event Handler, and Custom Logging for SSIS Packages.
- Extensive experience in using Talend features such as context variables, triggers, connectors for Database and flat files.
- Hands on Experience on many components which are there in the palette to design Jobs & used Context Variables to Parameterize Talend Jobs.
- Scheduling and monitoring ETL processes using the DTExec utility and batch files.
- Expertise in generating reports using SQL Server Reporting Services, Crystal Reports, and MS Excel spreadsheets and Power Pivot.
- Performed data mining activities like predictive analysis and forecasting on a central repository for various application and dashboard functionalities.
- Expertise in creating joblets in Talend for processes that can be reused across most jobs in a project, such as Start Job and Commit Job.
- Expert in designing enterprise reports using SQL Server Reporting Services (SSRS 2000/2005/2008); generated drill-down reports, parameterized reports, linked reports, sub-reports, matrix reports, dynamic filters, and charts in SSRS 2005/2008.
- Outstanding problem-solving methods to help me design the best strategies of measuring information and reviewing the results.
- Experience in using machine learning algorithms.
- Proficiency in statistical analysis, quantitative analysis, forecasting/predictive analytics, multivariate testing, and optimization algorithms.
- Experience in building predictive models, explanatory models, and exploratory data analysis (EDA, statistical and graphical) using R/Python.
- Extensive experience using database tools such as SQL*Plus and SQL*Loader.
- Excellent skills in fine tuning the ETL mappings in Informatica.
- Experience in turning ideas into actionable designs. Able to persuade stakeholders and champion effective techniques through product development.
- Skilled in developing and deploying hypotheses and analyzing test results, providing the necessary analytical rigor to ensure data quality, consistency, repeatability, and accuracy of insights.
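The de-normalization technique listed above can be illustrated with a minimal sketch. The data and field names below are hypothetical, and Python stands in for the SQL/ETL tooling purely for brevity:

```python
# Hypothetical illustration of de-normalization: folding dimension
# attributes into fact rows so downstream reports avoid join overhead.
customers = {1: {"name": "Acme", "region": "TX"}}   # dimension (sample data)
facts = [{"customer_id": 1, "amount": 250.0}]       # fact rows (sample data)

def denormalize(facts, dim):
    """Return fact rows widened with their dimension attributes."""
    out = []
    for row in facts:
        merged = dict(row)
        merged.update(dim.get(row["customer_id"], {}))
        out.append(merged)
    return out

wide = denormalize(facts, customers)
```

The trade-off is the usual one: wider rows cost storage and update complexity but remove a join from every report query.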
TECHNICAL SKILLS
Languages: SQL, PL/SQL, Unix Shell Script.
Operating Systems: Windows, Linux, Mac OS
Scripting languages: HTML, DHTML, Python, XML, JavaScript
Database Languages: Oracle 12c/11g/10g/9i, MySQL, Teradata, IBM Netezza.
BI/ETL Tools: Informatica Power Center 9.5.1/9.1/8.x/7.1, Informatica Data Quality (IDQ), Talend Open Studio for data Integration 5.6.1, SQL Server Integration Services (SSIS), SQL Server Reporting Services (SSRS), SQL Server Analysis Services (SSAS), Business Intelligence Development Studio (BIDS), Visual.
Database Design: MS Visio, Star Schema/Snowflake Schema
Data Modeling: FACT & dimension tables, physical & logical data modeling, and de-normalization techniques.
Packages: Microsoft Office Suite.
Tools and Utilities: SQL Server Management Studio, SQL Server Enterprise Manager, SQL Server Profiler, Import & Export Wizard, Visual Studio .NET, Microsoft Management Console, Visual SourceSafe 6.0, DTS, Business Intelligence Development Studio (BIDS), Crystal Reports, MS Office, Microsoft Excel Power Pivot, Excel Data Explorer.
PROFESSIONAL EXPERIENCE
Confidential, Dallas, TX
ETL Developer.
Responsibilities:
- Test ETL procedures for all new projects, maintain effective awareness of all production activities according to required standards, and provide support to all existing applications.
- Collaborate with the Metadata Manager, perform tests, and provide updates on all ETL activities within schedule; support large data volumes and assist in data processing.
- Document all technical and system specifications for all ETL processes.
- QA and unit/integration testing of developed ETL modules using Java, Python, and SQL code.
- Testing using Agile (Scrum) and Waterfall methodologies.
- Perform tests and validate all data flows and prepare all ETL processes according to business requirements and incorporate all business requirements into all design specifications.
- Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
- Followed the organization-defined naming conventions for flat file structures, Talend jobs, and the daily batches that execute the Talend jobs.
- Prepare reports for all metadata integration into systems, draft all ETL scripts, and prepare required reports for all end users.
- Perform root cause analysis on all processes, resolve all production issues, validate all data, perform routine tests on databases, and provide support to all ETL applications.
- Created Talend jobs to populate the data into dimensions and fact tables.
- Experience in using Talend MDM components like tMDMBulkLoad, tMDMClose, tMDMCommit, tMDMConnection, tMDMDelete, tMDMInput, tMDMOutput, tMDMReceive, tMDMRollback, tStewardshipTaskDelete, tStewardshipTaskInput, and tStewardshipTaskOutput.
- Document all test procedures for systems and processes, coordinate with business analysts and users to resolve all requirement issues, and maintain quality throughout.
- Monitor all business requirements and validate all designs and schedule all ETL processes and prepare documents for all data flow diagrams using Pentaho.
- Responsible for improving ETL job performance by modifying queries/mappings and partitioning sessions using performance-tuning techniques.
- Load and transform data into HDFS from large sets of structured data in Oracle/SQL Server using Talend Big Data studio.
- Coordinate with customers and staff and provide support to all data analysis.
- Perform data analysis on all results and prepare presentations for clients.
- Perform audit on data and resolve business related issues for customer base.
- Perform data analysis and facilitate in delivery to all end users.
- Supervise all client issues and coordinate with managers and supervisors and facilitate in deliverables.
- Create Entity Relationship (ER) diagrams for the proposed database.
- Create database objects such as tables, views, stored procedures, triggers, etc.
- Maintain referential integrity, domain integrity and column integrity by using the available options such as constraints etc.
- Identify columns for Primary Keys in all the tables at the design time and create them.
- Create functions to provide custom functionality as per the requirements.
- Be aware of potential blocking and deadlocking, and write code to avoid those situations.
- Participate in development and creation of Data warehouses.
- Create cubes in SQL Server Analysis Services.
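The dimension and fact loads described above hinge on surrogate-key assignment. A minimal sketch of that step follows, with hypothetical natural keys and Python standing in for the Talend components actually used:

```python
def load_dimension(existing, incoming):
    """Assign surrogate keys to new natural keys; reuse keys for known ones.

    existing: dict mapping natural key -> surrogate key (the dimension table)
    incoming: iterable of natural keys arriving from the source extract
    """
    next_sk = max(existing.values(), default=0) + 1
    for nk in incoming:
        if nk not in existing:
            existing[nk] = next_sk
            next_sk += 1
    return existing

# Sample run: CUST-A is already known, CUST-B is new.
dim = load_dimension({"CUST-A": 1}, ["CUST-A", "CUST-B"])
```

Fact rows are then loaded with the surrogate key looked up from this mapping rather than the source's natural key.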
Environment: Talend Data Integration, SQL Server, SQL Server Analysis Services, DTS, Ascential DataStage Server Edition v7.5/v8.5, SQL/PL-SQL scripting, SOAP and RESTful web services, Pentaho, shell scripting, OBIEE 10g/11g, MS Visio, ERwin, Toad, PuTTY, Microsoft Office.
Confidential, Milwaukee, WI
ETL Developer.
Responsibilities:
- Coordinate with ETL team to implement all ETL procedures for all new projects and maintain effective awareness of all production activities according to required standards and provide support to all existing applications.
- Interacting with the client for requirements gathering and analysis for the Talend Migration projects.
- Analyzed the Migration requirements and framed the business logic and implemented it using Talend.
- Development of a detailed deployment plan to convert all integration jobs to Talend.
- Involved in building the ETL architecture and Source to Target mapping to load data into Data warehouse.
- Extensively created mappings in Talend DI using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregateRow, tSortRow, tUniqRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tJava, tJavaRow, tWarn, tOracleOutputBulkExec, tFilter, tGlobalMap, tDie, tFlowMeterCatcher, and tFileList, translating the business requirements.
- Experience in creating Joblets in TALEND for the processes which can be used in most of the jobs in a project like to Start job and Commit job.
- Worked with various File components like tFileCopy, tFileCompare, tFileExist, TFileDelete, tFileRename.
- Worked on Exporting and Importing of Talend jobs.
- Responsible for developing jobs using ESB components like tESBConsumer, tESBProviderFault, tESBProviderRequest, tESBProviderResponse, tRESTClient, tRESTRequest, and tRESTResponse to get the service calls for customers' DUNS numbers.
- Worked closely with business application teams, business analysts, data architects, database administrators and reporting teams to ensure the ETL solution meets business requirements.
- Created Slowly Changing Dimension (SCD) Type 2 mappings for developing the dimensions to maintain the complete historical data.
- Prepared migration documents to move the mappings from development to testing and then to production repositories.
- Came up with the design plan and prepared the ETL design document.
- Responsible in managing tasks and deadlines for the ETL teams both Onsite and Offshore.
- Organize daily technical discussions with the Onsite team also including the individual offshore work stream leads and set expectations for offshore delivery.
- Network within the team, onsite / offshore to get the necessary clarifications to execute work units assigned.
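The SCD Type 2 mappings described above keep full history by expiring the current row and opening a new one when attributes change. A simplified sketch with hypothetical fields (the real logic lived in Talend/Informatica mappings, not Python):

```python
def scd2_upsert(history, natural_key, attrs, today):
    """SCD Type 2: close the current row on change and open a new version.

    history: list of dicts with keys nk, attrs, start, end (end=None = current)
    """
    current = [r for r in history if r["nk"] == natural_key and r["end"] is None]
    if current and current[0]["attrs"] == attrs:
        return history                      # unchanged: nothing to do
    for r in current:
        r["end"] = today                    # expire the old version
    history.append({"nk": natural_key, "attrs": attrs,
                    "start": today, "end": None})
    return history

# Sample run: a customer changes tier, producing two history rows.
hist = scd2_upsert([], "C1", {"tier": "gold"}, "2020-01-01")
hist = scd2_upsert(hist, "C1", {"tier": "silver"}, "2020-03-01")
```

Queries for "current state" filter on `end is None`; point-in-time queries filter on the start/end range.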
Environment: Talend Data Integration 5.6, Informatica Power Center 9.6(Repository Manager, Designer, Workflow Manager, Workflow Monitor and Repository Server Admin console), Power Exchange 9.1, Informatica developer 9.1, Oracle 12c/11g, PL/SQL, SQL, TOAD, LINUX, Shell scripts, Control-M.
Confidential, Bartlesville, OK
ETL Developer.
Responsibilities:
- Used Informatica Power Center 8.6 for migrating data from various OLTP databases and other applications to the Radar Store Data Mart
- Worked with different sources like Relational, Mainframe, XML, flat files (CSV) loaded the data into Oracle staging
- Created complex Informatica mappings with extensive use of Aggregator, Union, Filter, Router, Normalizer, Java, Joiner and Sequence generator transformations
- Created and used parameter files to perform different load processes using the same logic
- Extensively used PL/SQL for creation of stored procedures and worked with XML Targets, XSD's and DTD's
- Filtered Changed data using Power exchange CDC and loaded to the target
- Defined Target Load Order Plan and Constraint based loading for loading data appropriately into multiple Target Tables
- Used different Tasks (Session, Assignment, Command, Decision, Email, Event-Raise, Event-Wait and Control) in the creation of workflows
- Utilized the new utility Informatica Data Quality (IDQ) and Informatica Data Explorer (IDE) that came up with Informatica Version 8
- Performed performance tuning at the source, target, mapping, and session levels
- Involved in modifying already existing UNIX scripts and used them to automate the scheduling process
- Coordinated with testing team to make testing team understand business and transformation rules being used throughout ETL process
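The parameter files mentioned above let one mapping drive several load processes. As a rough analogy only (real Informatica parameter files use bracketed section headers; the flat `KEY=VALUE` format here is a simplification), the parsing step can be sketched as:

```python
def parse_param_file(text):
    """Parse a simple KEY=VALUE parameter file into a dict.

    Blank lines and '#' comments are ignored; values keep their raw text,
    mirroring how $$ mapping parameters arrive as strings.
    """
    params = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        params[key.strip()] = value.strip()
    return params

# Sample parameter file for one of several load runs (hypothetical values).
params = parse_param_file("# nightly load\n$$LOAD_DATE=2019-06-30\n$$SRC_DB=ORCL\n")
```

Swapping in a different file reconfigures the same load logic for another source or date, which is the whole point of the pattern.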
Environment: Informatica Power Exchange, Informatica Data Quality (IDQ), Business Objects.
Confidential, Baton Rouge, LA
ETL/Informatica Developer.
Responsibilities:
- Gathered user Requirements and designed Source to Target data load specifications based on Business rules.
- Used Informatica PowerCenter 9.5 for extraction, transformation, and loading (ETL) of data in the data mart.
- Designed and developed ETL mappings to extract data from flat files, MS Excel, and Oracle and load it into the target database.
- Developed several complex mappings in Informatica using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer.
- Extensively used ETL processes to load data from various source systems such as DB2, SQL Server and Flat Files, XML files into target system applying business logic on transformation mapping for inserting and updating records when loaded.
- Worked with the Teradata data lake.
- Responsible for designing and developing Teradata BTEQ scripts and MLOAD jobs based on the given business rules and design documents.
- Experience in coding Teradata SQL, Teradata stored procedures, macros, and triggers.
- Handled data loading operations from flat files to tables using NZLOAD utility.
- Extensively used NZSQL and NZLOAD utilities.
- Developed UNIX Shell scripts in conjunction with NZSQL/NZLOAD utilities to load data from flat files to Netezza database.
- Loaded data into Netezza from legacy systems and flat files using complex UNIX scripts. Implemented SCD Type 1 and Type 2 mappings to update slowly changing dimension tables.
- Experience in migrating Informatica objects from 9.1 to 9.5.
- Experience in Informatica upgrade testing, troubleshoot and resolve issues.
- Created complex mappings in the designer and monitored them. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator transformations.
- Ran the workflows on a daily and weekly basis using Active Batch Scheduling tool.
- Examined the workflow log files and assigned tickets to Informatica support based on the error.
- Experience in developing Unix Shell Scripts for automation of ETL process.
- Performed operational support and maintenance of ETL bug fixes and defects.
- Maintained the target database in the production and testing environments.
- Supported migration of ETL code from development to QA and QA to production environments.
- Migration of code between the Environments and maintaining the code backups.
- Designed and developed UNIX shell scripts and FTP transfers, sending files to the source directory and managing session files.
- Performed extensive testing and wrote SQL queries to verify the data loads.
- Developed PL/SQL code at the database level for the new objects.
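The flat-file loads above follow a parse-then-load staging pattern. A minimal sketch of the parsing half, with hypothetical pipe-delimited data (NZLOAD handles this natively; Python is used here only to illustrate the shape of the step):

```python
import csv
import io

def read_flat_file(text, delimiter="|"):
    """Parse a delimited flat file into row dicts, as a loader staging step.

    The first line is taken as the header; every value stays a string,
    matching how flat-file extracts arrive before type conversion.
    """
    reader = csv.DictReader(io.StringIO(text), delimiter=delimiter)
    return [dict(r) for r in reader]

# Sample extract (hypothetical rows).
rows = read_flat_file("id|name\n1|Acme\n2|Globex\n")
```

In the real pipeline the parsed rows would be bulk-loaded into a staging table and typed/validated there before moving to the warehouse.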
Environment: Informatica PowerCenter 9.5, Oracle11g/9i, PL/SQL, Flat files, Facets, XML, Teradata.
Confidential
ETL/Informatica Developer.
Responsibilities:
- Experience with ETL (Informatica): extracting, transforming, and loading data from the production database to the data warehouse database.
- Worked on transformations such as source qualifier, joiner, lookup, rank, expression, aggregator and sequence generator etc.
- Reused mapping, Maplets and transformations.
- Scheduled sessions and batches on the Informatica server using the Informatica Server Manager.
- Tuned transformations and mappings in Informatica.
- Implemented slowly changing dimensions.
- Created Unix Shell Scripts and PL/SQL procedures.
Environment: Informatica PowerCenter 7.1, Shell Scripts, Database Systems, SQL and PL/SQL, Windows NT, Oracle8i and UNIX
Confidential
Informatica Developer.
Responsibilities:
- Extensively used ETL (Informatica) to load data stored in different database sources (Oracle, SAP, Access).
- Involved in the development of Informatica Mappings using various transformations like Source Qualifier, Look-up, Filter, Expression, Normalizer, Update Strategy, Sorter, Joiner, etc.
- Worked closely with Business Analysts to find out the relationships between source systems and the business process needed in the migration process.
- Wrote session commands to configure pre-session and post-session tasks.
- Applied the complex business rules using Oracle functions and stored procedures, which are used as standalone procedures or used as pre/post load procedures in Informatica mappings.
- Created and used reusable worklets, sessions and tasks in the workflow Manager and monitored them in the workflow Monitor.
- Created Event Wait, Event Raise, Timer and Control Events in workflow manager according to the business requirement.
- Used Command task to move the parameter files to the desired position at the start of the session.
- Worked with pre-session and post-session SQL and stored procedures.
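The pre-session/post-session pattern above (truncate or prepare targets before the load, audit or clean up after) can be sketched generically. The function names are hypothetical; in PowerCenter these hooks are configured on the session rather than coded:

```python
def run_session(load, pre=None, post=None):
    """Run an optional pre-session step, the main load, then a post step.

    Each argument is a callable returning a status string; the returned
    list mimics a session log of the steps that actually ran.
    """
    log = []
    if pre:
        log.append(pre())    # e.g. truncate staging, disable indexes
    log.append(load())       # the mapping's main data movement
    if post:
        log.append(post())   # e.g. rebuild indexes, write audit row
    return log
```

The value of the pattern is that target preparation and auditing stay attached to the load that needs them instead of living in a separate schedule.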
Environment: Informatica Power Center 8.3, Oracle 9i, Teradata, SQL Server, Toad, SQL, PL/SQL Stored procedures, Power Exchange, Window 2000, UNIX