Sr. ETL/Talend Developer Resume
Beaufort, SC
SUMMARY:
- Certified DataStage consultant with around 8 years of experience as a technical analyst, safety lead, developer, and administrator, with solid experience in the Banking, Finance, Health Care, Automobile, and Manufacturing domains.
- Experience with Talend Enterprise Edition for Big Data & Data Integration v5.5/6.
- Extensive experience in IBM InfoSphere DataStage ETL integration with the SAP Bank Analyzer functional module, performing data extraction, loading, and transformation for financial general ledger account data.
- Experience in Hadoop big data integration with DataStage ETL, performing data extraction, loading, and transformation for automobile ERP data.
- Experience with balanced optimization for DataStage jobs and HQL conversions. Experience writing Hive queries to load data into HDFS (see the Hive sketch after this summary). Knowledge of IBM InfoSphere MDM member model configuration and the MDM data model.
- Experience in UNIX file/dataset management to keep load-ready data for all financial transactional data.
- Experience in Oracle performing product hierarchy validations against master data and activity-based costing references for vertical financial modules.
- Experience in Teradata performing profit analytical model (PAM) data validations.
- Experience with Agile process implementation as a Scrum team member. Experience in functional document and HLD preparation, and in business and data analysis on the financial system roadmap for banking products.
- Drove process improvement as Quality Champion, ensuring quality code deployment in every monthly production cycle.
- Data processing experience emphasizing analysis, design, development, testing, and implementation of various projects in the Pharmaceutical, Material Management, and Banking & Financial industries.
- Involved in the complete Software Development Life Cycle (SDLC) in a large data warehouse environment for a financial data system.
- Thorough knowledge of data warehousing, dimensional modeling, data integration, data virtualization, data synchronization, star and snowflake schemas, ETL development and performance tuning, BI data analysis, SAP integration, and DFS/HDFS cluster segregation.
- Thorough domain knowledge of banking and financial systems; involved in the profitability financial structure for various banking products such as loans, cards, investments, and deposits across all instruments.
- Played a key role on credit and small business cards, loading defaulting accounts into the financial module.
- Performed functional analysis on commercial and consumer loans for MDM hierarchy data validations and loads into financial system functional modules.
- Performed global template hierarchy checks on financial products for master data such as funds transfer pricing (FTP), business partners, and activity-based costing, along with risk data analysis on capital funding.
- Thorough domain knowledge in pharma; involved in all four phases of clinical trials.
- Worked with senior biostatisticians and clinical data managers to provide ETL programs for analyzing data and generating safety and efficacy loads and summary tables.
- Excellent oral and written communication skills. A quick learner with an eye for detail and excellent analytical and quantitative skills.
- Ability to work independently and adept at managing multiple competing tasks.
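A minimal HiveQL sketch of the HDFS loading pattern referenced above; the table layout and paths are hypothetical, for illustration only:

```sql
-- Hypothetical staging table over HDFS; load-ready files are moved in
-- from an inbound directory, making them queryable through Hive.
CREATE EXTERNAL TABLE IF NOT EXISTS stg_gl_accounts (
  account_id  STRING,
  gl_code     STRING,
  posted_amt  DECIMAL(18,2),
  posted_date DATE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION '/data/stage/gl_accounts';

-- Move a load-ready file from the inbound HDFS path into the table location.
LOAD DATA INPATH '/data/inbound/gl_accounts_20160101.dat'
INTO TABLE stg_gl_accounts;
```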
PROFESSIONAL EXPERIENCE:
Sr. ETL/Talend Developer
Confidential, Beaufort, SC
Responsibilities:
- Developed complex ETL mappings for stage, dimension, fact, and data mart loads.
- Involved in data extraction from various databases and files using Talend. Created Talend jobs using the dynamic schema feature.
- Used big data components (Hive components) to extract data from Hive sources.
- Performance tuning: used tMap cache properties, multi-threading, and the tParallelize component for better performance on large source data; tuned SQL source queries to restrict unwanted data in the ETL process.
- ELT components: applied the pushdown optimization technique, moving transformation logic to the database side instead of handling it on the Talend side. When database tables are properly indexed and data volumes are large, the ELT method can be a much better option for job performance (see the sketch after this list).
- Used AWS (Amazon Web Services) components, downloading and uploading data files to and from S3 as part of ETL using the S3 components.
- Used many Talend components across job designs, including tJava, tOracle, tXMLMap, tFileInputDelimited, tLogRow, and tLogCatcher.
- Worked on joblets (reusable code) and Java routines in Talend.
- Implemented a Talend POC to extract data from the Salesforce API as XML objects and .csv files and load it into a SQL Server database.
- Implemented error logging, error recovery, and performance enhancements, and created a generic audit process for various application teams.
- Used the repository manager to migrate source code from lower to higher environments.
- Created projects in TAC, assigned appropriate roles to developers, and integrated SVN (Subversion).
- Designed custom components and embedded them in Talend Studio.
- Provided on-call support once the project was deployed to later phases. Used the TAC Job Conductor to schedule ETL jobs on a daily, weekly, monthly, and yearly basis (cron trigger).
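A minimal sketch of the pushdown idea above, assuming hypothetical staging and target tables. This is the shape of SQL that Talend's tELT* components generate: the join and aggregation run inside the database rather than row by row in the Talend JVM.

```sql
-- Hypothetical ELT pushdown: the whole transformation executes in the
-- database, so only the final result set is materialized.
INSERT INTO dw_daily_sales (product_id, sale_date, total_amt)
SELECT s.product_id,
       s.sale_date,
       SUM(s.amount)
FROM   stg_sales s
JOIN   dim_product p
       ON p.product_id = s.product_id   -- indexed join key
GROUP  BY s.product_id, s.sale_date;
```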
Environment: Talend Data Integration 6.1/5.5.1, Talend Enterprise Big Data Edition 5.5.1, Talend Administration Center, Oracle 11g, Hive, HDFS, Sqoop, Netezza, SQL Navigator.
ETL/Talend Developer
Confidential, Columbia, SC
Responsibilities:
- Participated in JAD sessions with business users and SMEs to better understand the reporting requirements.
- Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts.
- Analyzed source data to assess data quality using Talend Data Quality.
- Broad design, development, and testing experience with Talend Integration Suite, with knowledge of performance tuning of mappings.
- Developed jobs in Talend Enterprise Edition across the stage, source, intermediate, conversion, and target layers.
- Wrote SQL queries and used joins to access data from Oracle and MySQL.
- Solid experience implementing complex business rules by creating reusable transformations and robust mappings with Talend components such as tConvertType, tSortRow, tReplace, tAggregateRow, and tUnite.
- Developed Talend jobs to populate claims data into the data warehouse star schema.
- Developed mappings to load fact and dimension tables, including SCD Type 1 and SCD Type 2 dimensions and incremental loads, and unit tested the mappings (see the SCD sketch after this list).
- Used tStatsCatcher, tDie, and tLogRow to create a generic joblet that stores processing stats into a database table to record job history.
- Integrated Java code inside Talend Studio using components such as tJavaRow, tJava, and tJavaFlex, and through routines.
- Used Talend's debug mode to troubleshoot jobs and fix errors. Created complex mappings using tHashOutput, tHashInput, tNormalize, tDenormalize, tMap, tUniqRow, tPivotToColumnsDelimited, etc.
- Used the tRunJob component to run a child job from a parent job and to pass parameters from parent to child.
- Created context variables and groups to run Talend jobs against different environments.
- Used the tParallelize component and the multi-thread execution option to run subjobs in parallel, improving job performance.
- Implemented FTP operations in Talend Studio to transfer files between network folders as well as to an FTP server, using components such as tFileCopy, tFileArchive, tFileDelete, tCreateTemporaryFile, tFTPDelete, tFTPRename, tFTPPut, and tFTPGet.
- Experienced in building Talend jobs outside of Talend Studio as well as on the TAC server.
- Experienced in writing expressions within tMap per business needs. Handled insert and update strategies using tMap. Followed ETL methodologies and best practices when creating Talend ETL jobs.
- Extracted data from flat files and databases, applied business logic, and loaded it into the staging database as well as flat files.
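A minimal SQL sketch of the SCD Type 2 pattern mentioned above; table and column names are hypothetical, and in the jobs themselves this logic lived in tMap lookups with separate insert and update outputs:

```sql
-- Step 1 (hypothetical): expire the current dimension row when a tracked
-- attribute has changed in staging.
UPDATE dim_customer
SET    eff_end_date = CURRENT_DATE,
       is_current   = 'N'
WHERE  is_current = 'Y'
  AND  customer_id IN (SELECT s.customer_id
                       FROM   stg_customer s
                       JOIN   dim_customer d
                              ON  d.customer_id = s.customer_id
                              AND d.is_current  = 'Y'
                       WHERE  s.address <> d.address);

-- Step 2: insert a fresh current row for new customers and for the
-- customers just expired above (neither has a current row any more).
INSERT INTO dim_customer (customer_id, address, eff_start_date, eff_end_date, is_current)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON  d.customer_id = s.customer_id
       AND d.is_current  = 'Y'
WHERE  d.customer_id IS NULL;
```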
Environment: Talend 5.5/5.0, Oracle 11g, Teradata SQL Assistant, HDFS, MS SQL Server 2012/2008, PL/SQL, Agile Methodology, Informatica, TOAD, ERwin, AIX, Shell Scripts, AutoSys, SVN.
Sr. ETL/Talend Developer
Confidential, Rock Hill, SC
Responsibilities:
- Worked closely with business analysts to review the business specifications of the project and to gather the ETL requirements.
- Developed jobs, components, and joblets in Talend. Designed ETL jobs/packages using Talend Integration Suite (TIS).
- Created complex mappings in Talend using tHash, tDenormalize, tMap, tUniqRow, and tPivotToColumnsDelimited, as well as custom components such as tUnpivotRow.
- Used tStatsCatcher, tDie, and tLogRow to create a generic joblet that stores processing stats into a database table to record job history (see the audit sketch after this list).
- Created Talend Mappings to populate the data into dimensions and fact tables.
- Frequently used the Talend Administration Center (TAC). Set up new users, projects, and tasks across multiple TAC environments (Dev, Test, Prod, and DR).
- Developed complex Talend ETL jobs to migrate data from flat files to databases.
- Implemented custom error handling in Talend jobs and worked on different methods of logging.
- Created ETL/Talend jobs, both design and code, to process data into target databases.
- Created Talend jobs to load data into various Oracle tables. Utilized Oracle stored procedures and wrote Java code to capture global map variables and use them in the job.
- Successfully loaded data from various source systems such as Oracle, DB2, flat files, and XML files into the staging tables and then into the target database.
- Troubleshot long-running jobs and fixed the issues.
- Prepared an ETL mapping document for every mapping and a data migration document for smooth transfer of the project from development to testing and then to production.
- Performed Unit testing and System testing to validate data loads in the target.
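A minimal sketch of the job-history audit pattern the joblet above fed; the table layout is hypothetical:

```sql
-- Hypothetical audit table: one row per job run, recording outcome and
-- row counts captured by the tStatsCatcher/tDie joblet.
CREATE TABLE etl_job_audit (
  audit_id     INTEGER       NOT NULL PRIMARY KEY,
  job_name     VARCHAR(100)  NOT NULL,
  start_time   TIMESTAMP     NOT NULL,
  end_time     TIMESTAMP,
  rows_read    INTEGER,
  rows_written INTEGER,
  status       VARCHAR(20),          -- 'SUCCESS' or 'FAILED' (tDie path)
  error_msg    VARCHAR(1000)
);

-- Example row written on a successful run.
INSERT INTO etl_job_audit
  (audit_id, job_name, start_time, end_time, rows_read, rows_written, status)
VALUES
  (1, 'load_dim_customer', TIMESTAMP '2015-06-01 02:00:00',
   TIMESTAMP '2015-06-01 02:04:12', 51230, 51230, 'SUCCESS');
```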
Environment: Talend Open Studio 5.0.1, Informatica Power center, UNIX, Oracle, SQL Server, TOAD, AutoSys.
Senior Informatica Specialist
Confidential, Columbia, SC
Responsibilities:
- Developed high-level technical design specifications and low-level specifications based on the business requirements.
- Extensively used Informatica client tools (Source Analyzer, Warehouse Designer, Mapping Designer and Workflow Manager).
- Used Informatica Designer to develop mappings with transformations including Aggregator, Update Strategy, Lookup, Expression, Filter, Sequence Generator, Router, and Joiner.
- Created reusable transformations and mapplets and used them in mappings to reduce redundancy in coding.
- Extensively used Informatica PowerExchange Change Data Capture (CDC) to create data maps from mainframe tables.
- Coded a number of batch and online programs using COBOL, DB2, and JCL. Designed complex mappings using Lookup (connected and unconnected), Update Strategy, and Filter transformations for loading historical data.
- Extensively used SQL commands in workflows prior to extracting the data in the ETL tool (see the pre-SQL sketch after this list).
- Implemented various workflow tasks, including Session, Command, Decision, Timer, Assignment, Event-Wait, Event-Raise, Control, and E-Mail tasks.
- Used the Debugger to test data flow and fix mappings. Involved in performance tuning of the mappings to improve performance.
- Performed Unit Testing and prepared unit testing documentation. Developed the Test Cases and Test Procedures.
- Made extensive use of IDQ for data profiling and quality. Built a UNIX script that checks mapping, session, and workflow names by identifying the PowerCenter folders, builds an XML, tars all documents based on the given names, and deploys them across the environments.
- Scheduled jobs and box jobs in AutoSys and analyzed the run status of both in the DB2 environment.
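A minimal sketch of the kind of pre-extraction SQL referenced above, with hypothetical staging and control objects; the idea is to reset staging and record the extraction window before the session reads data:

```sql
-- Hypothetical pre-session SQL (DB2 flavor): clear staging so the
-- session always loads into an empty, consistent table.
DELETE FROM stg_policy_daily;

-- Record the extraction timestamp for downstream incremental logic.
UPDATE etl_control
SET    last_extract_ts = CURRENT TIMESTAMP
WHERE  subject_area = 'POLICY';
```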
Environment: Informatica Power Center 9.0/8.6, Informatica Power Exchange CDC, DB2 Mainframe, AutoSys, Toad, Windows XP, UNIX.
Senior Informatica Developer
Confidential, Hilton Head, SC
Responsibilities:
- Presented the IT solution implementation approach as dictated by the Business Requirements Documents, including requirements classification and methodology.
- Presented the risks, dependencies, and outstanding items requiring attention from project stakeholders. Responsible for designing, developing, and unit testing the mappings.
- Developed mappings using Informatica PowerCenter Designer to load data from various source systems to target database as per the business rules.
- Used various transformations such as Source Qualifier, Aggregator, connected and unconnected Lookup, Filter, Sequence Generator, Router, Update Strategy, and Expression.
- Involved in developing test cases for the Informatica mappings and update processes.
- Responsible for monitoring all the sessions that are running, scheduled, completed and failed. Debugged the mapping of the failed session.
- Used mapplets and reusable transformations to prevent redundant transformation logic and improve maintainability. Performed unit, system, and integration testing for the jobs.
- Validated test results by executing queries in TOAD.
- Prepared test plans for both unit and system tests.
- Developed reusable transformations and mapplets. Wrote design documentation for the ETL process and Informatica mappings.
- Unit tested the mappings by running SQL queries and comparing the data in the source and target databases (see the validation sketch after this list).
- Worked with source teams to resolve data quality issues raised by end users. Created TIDAL jobs and schedules: on-demand, run-on-time (daily, weekly, monthly), run-once, and ad hoc.
- Wrote pre-session and post-session Korn shell scripts for dropping and creating table indexes, and created shell scripts that substitute user and schema information into the SQL queries.
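A minimal sketch of the source-to-target validation queries mentioned above; the schema and table names are hypothetical:

```sql
-- Row counts must match between source and target...
SELECT COUNT(*) FROM src_schema.orders;
SELECT COUNT(*) FROM tgt_schema.fact_orders;

-- ...and the set difference should be empty if the mapping moved the
-- data faithfully (Oracle MINUS).
SELECT order_id, order_amt
FROM   src_schema.orders
MINUS
SELECT order_id, order_amt
FROM   tgt_schema.fact_orders;
```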
Environment: Informatica Power Center 8.6, Oracle 10g, SQL Server 2008, Business Objects XI R2, TIDAL
Informatica Developer
Confidential, Beaufort, SC
Responsibilities:
- Worked on Informatica - Repository Manager, Designer, Workflow Manager & Workflow Monitor.
- Integrated data into the CDW by sourcing it from different sources such as SQL, flat files, and mainframe DB2 using PowerExchange.
- Extensively worked on integrating data from Mainframes to Informatica Power Exchange.
- Extensively worked with Informatica tools such as Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, and Mapping Designer to design, develop, and test complex mappings and mapplets that load data from external flat files and RDBMS sources.
- Worked with output XML files to remove empty delta files and to FTP the output XML files to a different server.
- Worked with the business analyst team during the functional and technical design phases. Designed the mappings from sources (external files and databases) to operational staging targets.
- Extensively used various transformations such as Source Qualifier, Joiner, Aggregator, connected and unconnected Lookup, Filter, Router, Expression, Rank, Union, Normalizer, XML transformations, Update Strategy, and Sequence Generator.
- Used the XML transformation to load data from XML files. Worked with Informatica schedulers to schedule the workflows. Extensively worked with target XSDs to generate the output XML files.
- Created mappings that read parameter data from tables to generate parameter files (see the sketch after this list). Good experience coordinating with offshore teams.
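A minimal sketch of the parameter-table pattern above, assuming a hypothetical control table: the mapping selects name/value pairs and writes them out as parameter-file lines.

```sql
-- Hypothetical parameter control table: one row per parameter per workflow.
CREATE TABLE etl_param (
  folder_name VARCHAR(50),
  wf_name     VARCHAR(80),
  param_name  VARCHAR(80),    -- e.g. '$$LAST_RUN_DATE'
  param_value VARCHAR(200)
);

-- Source query for the mapping that emits one parameter-file line per row.
SELECT param_name || '=' || param_value AS param_line
FROM   etl_param
WHERE  folder_name = 'CDW'
  AND  wf_name     = 'wf_load_claims'
ORDER  BY param_name;
```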
Environment: Informatica Power Center 8.6.1, Power Exchange 8.6.1, Windows, IBM DB2 8.x, Mainframes, SQL Server 2008, Erwin.
SQL Developer
Confidential, Florence, SC
Responsibilities:
- Analyzed SSRS reports and fixed bugs in the underlying stored procedures. Used complex expressions to group data and to filter and parameterize reports.
- Created linked reports and managed snapshots using SSRS. Performed various calculations using complex expressions in the reports and created report models.
- Generated complex SSRS reports, including reports with cascading parameters, snapshot reports, drill-down reports, drill-through reports, parameterized reports, report models, and ad hoc reports, based on the Business Requirements Document.
- Performance-tuned complex SQL queries and stored procedures using SQL Profiler and the Index Tuning Wizard.
- Provided production support, analyzing and fixing problems and errors on a daily basis by modifying SSIS packages and stored procedures as necessary.
- Designed and developed tables, stored procedures, triggers, and SQL scripts using T-SQL, Perl, and shell scripting for enhancements and maintenance of various database modules (a T-SQL sketch follows this list).
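A minimal T-SQL sketch of the stored-procedure and index-tuning pattern above; all object names are hypothetical:

```sql
-- Hypothetical parameterized report procedure.
CREATE PROCEDURE dbo.usp_GetOrdersByRegion
    @RegionId INT,
    @FromDate DATETIME
AS
BEGIN
    SET NOCOUNT ON;

    SELECT o.OrderId, o.OrderDate, o.TotalAmt
    FROM   dbo.Orders o
    WHERE  o.RegionId  = @RegionId
      AND  o.OrderDate >= @FromDate
    ORDER  BY o.OrderDate;
END;
GO

-- Covering index of the kind the Index Tuning Wizard would suggest
-- for the query above.
CREATE NONCLUSTERED INDEX IX_Orders_Region_Date
    ON dbo.Orders (RegionId, OrderDate)
    INCLUDE (TotalAmt);
```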
Environment: MS SQL Server 2005/2008, SSRS, SSIS, SSAS, T-SQL, Erwin, SQL Explorer.