- Over 8 years of IT experience in analysis, design, development, implementation, and support of relational databases (OLTP), data warehousing systems (OLAP), and data marts across various domains.
- 6+ years of data warehousing experience in Extract, Transform, and Load (ETL) development using Talend and Informatica.
- 3+ years of development and 1+ years of administration experience, with good knowledge of Talend Administration Center (TAC) and job scheduling.
- Well versed in relational and dimensional modeling techniques such as Star Schema, Snowflake Schema, and Fact and Dimension tables.
- Extensive experience integrating heterogeneous data sources such as SQL Server, Oracle, flat files, and Excel files, and loading the data into data warehouses and data marts using Talend Studio.
- Working knowledge of databases such as MySQL and Oracle on AWS RDS.
- Experienced with Talend Data Fabric ETL components, including context variables and the MySQL, Oracle, and Hive database components.
- Experience with scheduling tools: AutoSys, Control-M, and Job Conductor (Talend Admin Console).
- Extensively created mappings in Talend using tMap, tJoin, tReplicate, tConvertType, tFlowMeter, tLogCatcher, tNormalize, tDenormalize, tJava, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalmap, tDie, etc.
- Excellent understanding and knowledge of NoSQL databases like HBase and Cassandra.
- Excellent understanding of Hadoop architecture, the Hadoop Distributed File System (HDFS), and its APIs.
- Extensive knowledge of business processes in the Healthcare, Manufacturing, Mortgage, Financial, Retail, and Insurance sectors.
- Strong skills in SQL and PL/SQL, backend programming, creating database objects like Stored Procedures, Functions, Cursors, Triggers, and Packages.
- Experience in AWS S3, EC2, SNS, SQS setup, Lambda, RDS (MySQL), and Redshift cluster configuration.
- Experienced in Waterfall, Agile/Scrum Development.
- Extensively used ETL methodology for data migration, data profiling, extraction, transformation, and loading using Talend; designed data conversions from a wide variety of source systems, including SQL Server, Oracle, and DB2, and non-relational sources such as XML, flat files, and mainframe files.
- Expertise in writing UNIX shell scripts and hands-on experience with most UNIX commands.
- Wrote shell scripts using UNIX commands to send emails and run batch files.
- Hands-on experience running Hadoop streaming jobs to process terabytes of XML-format data using Flume and Kafka.
- Designed and developed logical and physical data models using a data modeling tool (Erwin).
- Experienced in Code Migration, Version control, scheduling tools, Auditing, shared folders and Data Cleansing in various ETL tools.
- Good communication and interpersonal skills, ability to learn quickly, with good analytical reasoning and adaptive to new and challenging technological environment.
- Strong Team working spirit, relationship management and presentation skills.
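The shell-scripting work mentioned above (sending an email and running batch files) can be sketched as a small POSIX wrapper. This is a minimal sketch: the recipient address is illustrative, and `MAILER` is overridable (e.g. `MAILER=echo`) so it can be exercised without a configured mail system.

```shell
#!/bin/sh
# notify_and_run: run a batch script, then email its log with a status subject.
# MAILER is overridable so the sketch can be dry-run without mail(1) set up;
# the default recipient address is a placeholder, not a real mailbox.
notify_and_run() {
    batch="$1"
    recipient="${RECIPIENT:-etl-support@example.com}"
    log="/tmp/batch_$$.log"

    # Run the batch file, capturing both stdout and stderr into the log.
    if sh "$batch" > "$log" 2>&1; then
        status="SUCCESS"
    else
        status="FAILURE"
    fi

    # mail(1) reads the message body from stdin; -s sets the subject line.
    "${MAILER:-mail}" -s "Batch $batch finished: $status" "$recipient" < "$log"
    rm -f "$log"
}
```

In a scheduler (cron, AutoSys) this would be invoked as `notify_and_run /path/to/load_job.sh`, so every run produces a success/failure email with the job log attached as the body.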
ETL Tools: Talend 5.x/6.x, Informatica
RDBMS: Oracle 11g/10g, MS SQL Server 2005/07 and MS Access
Programming Languages: SQL, PL/SQL, Unix Shell Scripting, XML
Database Tools: TOAD, SQL Developer, SQL *Loader, SQL *Plus
Confidential, St. Louis, MO
- Interacted with business team to understand business needs and to gather requirements.
- Designed target tables per the reporting team's requirements and designed the Extraction, Transformation, and Loading (ETL) processes using Talend.
- Designed, built, installed, configured, and supported Hadoop.
- Created technical design documents covering source-to-stage and stage-to-target mappings.
- Worked with Talend Studio (development area) and the Admin Console (admin area).
- Created Java routines, reusable transformations, and Joblets using Talend as an ETL tool.
- Created complex jobs using transformations such as tMap, tOracle components, tLogCatcher, tStatCatcher, tFlowMeterCatcher, File Delimited components, and error-handling components (tWarn, tDie).
- Developed simple to complex MapReduce jobs using Hive and Pig for analyzing the data.
- Assisted in migrating the existing data center into the AWS environment.
- Wrote Hive queries for data analysis and to process the data for visualization.
- Developed UNIX shell scripts to process files on a daily basis: renaming files, extracting dates from file names, unzipping files, and removing junk characters before loading them into the base tables.
- Installed Talend Enterprise Studio on Windows and UNIX and configured it along with Java.
- Worked with Parallel connectors for parallel processing to improve job performance while working with bulk data sources.
- Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading.
- Created context variables to pass values throughout the process, from parent jobs to child jobs and from child jobs back to parent jobs.
- Worked on Joblets (reusable code) & Java routines in Talend.
- Performed Unit testing and created Unix Shell Scripts and provided on call support.
- Scheduled Talend jobs using Job Conductor (the scheduling tool available in the TAC).
- Retrieved data from Oracle and loaded it into the SQL Server data warehouse.
- Created many complex ETL jobs for data exchange with the database server and various other systems, including RDBMS, XML, CSV, and flat-file structures.
- Created and reviewed scripts to create new tables, views, queries for new enhancement in the applications using TOAD.
- Monitored data quality and generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL processes.
- Created UNIX shell scripts to automate data loading, extraction and to perform regular updates to database tables to keep in sync with the incoming data from other sources.
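The daily file-processing bullet above (renaming files, extracting dates, unzipping, removing junk characters) might look roughly like the following sketch. The `name_YYYYMMDD.ext` file-name pattern and the `staged_` prefix are assumptions about the feed's naming convention, not details from the original.

```shell
#!/bin/sh
# prep_daily_file: prepare one incoming file before it is loaded:
#   1. unzip it if it arrives gzip-compressed,
#   2. strip non-printable "junk" characters (keeping tabs and newlines),
#   3. rename it using the date extracted from its name (assumed
#      pattern: name_YYYYMMDD.ext -- adjust to the real feed's convention).
prep_daily_file() {
    f="$1"

    # 1. Unzip if needed (gunzip drops the .gz suffix).
    case "$f" in
        *.gz) gunzip -f "$f"; f="${f%.gz}" ;;
    esac

    # 2. Remove non-printable junk characters, keeping tabs and newlines.
    tr -cd '[:print:]\t\n' < "$f" > "$f.clean" && mv "$f.clean" "$f"

    # 3. Pull the 8-digit date out of the file name and move the file to a
    #    per-day staging name, e.g. claims_20240101.txt -> staged_20240101.txt
    d=$(basename "$f" | sed -n 's/.*_\([0-9]\{8\}\)\..*/\1/p')
    [ -n "$d" ] && mv "$f" "$(dirname "$f")/staged_${d}.txt"
}
```

A nightly cron entry would call this once per arriving file before the Talend load job picks up the `staged_*` files.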
Environment: Talend Platform for Big Data 6.0.1, UNIX, Oracle 10g, TAC (Admin Center)
Confidential, Albany, NY
- Designed and implemented ETL for data loads from source to target databases, including fact tables and Slowly Changing Dimension (SCD) Type 1, Type 2, and Type 3 loads to capture changes.
- Extensive experience extracting data from various sources, including relational databases (Oracle, SQL Server) and flat files.
- Involved in writing SQL Queries and used Joins to access data from Oracle, and MySQL.
- Participated in all phases of development life-cycle with extensive involvement in the definition and design meetings, functional and technical walkthroughs.
- Designed, developed, and deployed end-to-end data integration solutions.
- Implemented custom error handling in Talend jobs and worked on different methods of logging.
- Developed ETL mappings for XML, CSV, and TXT sources, loading data from these sources into relational tables with Talend; developed Joblets for reusability and to improve performance.
- Created UNIX scripts to automate status reporting for long-running and failed jobs.
- Developed high level data dictionary of ETL data mapping and transformations from a series of complex Talend data integration jobs.
- Expertise in interaction with end-users and functional analysts to identify and develop Business Requirement Documents (BRD) and Functional Specification documents (FSD).
- Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
- Created context variables and groups to run Talend jobs against different environments.
- Used Talend components tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator.
- Performed requirement gathering, Data Analysis using Data Profiling scripts, Data Quality (DQ) scripts and unit testing of ETL jobs.
- Created triggers for Talend jobs to run automatically on the server.
- Set up and managed transaction log shipping, SQL Server mirroring, failover clustering, and replication.
- Worked on AMC tables (error-logging tables).
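The long-running/failed-job status reporting described above can be sketched as a simple log scan. The `ERROR` and `JOB END` markers, and the one-log-per-job layout, are assumptions about the job log format rather than details from the original.

```shell
#!/bin/sh
# report_job_status: scan a directory of job logs and print one status line
# per job -- FAILED if the log contains an error marker, LONG-RUNNING if the
# log has no end marker yet, otherwise OK. Marker strings are assumptions.
report_job_status() {
    logdir="$1"
    for log in "$logdir"/*.log; do
        [ -e "$log" ] || continue          # empty directory: glob didn't match
        job=$(basename "$log" .log)
        if grep -q 'ERROR' "$log"; then
            echo "$job FAILED"
        elif ! grep -q 'JOB END' "$log"; then
            echo "$job LONG-RUNNING"
        else
            echo "$job OK"
        fi
    done
}
```

Piping the output through `grep -v OK` and into the mail wrapper gives the failure-only report; running it from cron makes the check periodic.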
Environment: Talend Platform for Data Management 5.6.1, UNIX scripting, TOAD, Oracle 10g, SQL Server
Confidential, Baltimore, MD
Informatica / Talend Developer
- Involved in the design and development of business requirements; analyzed application requirements and provided recommended designs.
- Participated actively in end user meetings and collected requirements.
- Used Informatica as an ETL tool for the extraction, transformation, and loading of data into the data warehouse.
- Designed, developed, and documented several mappings to extract the data from Flat files and Relational sources.
- Extensively worked on conformed dimensions for the purpose of incremental loading of the target database.
- Integrated java code inside Talend studio by using components like tJavaRow, tJava, tJavaFlex and Routines.
- Used ETL methodologies and best practices to create Talend ETL jobs.
- Experienced in using debug mode of Talend to debug a job to fix errors.
- Developed reusable transformations, mappings, and mapplets conforming to the business rules.
- Developed Talend jobs to populate the claims data into the data warehouse star schema.
- Created workflows and tested mappings and workflows in development and production environment.
- Used shell script to automate pre-session and post-session process.
- Used Debugger to test the mappings and fix the bugs and actively involved in performance improvements of mapping and sessions and fine-tuned all transformations.
- Involved in enhancements and maintenance activities of the data warehouse including performance tuning, rewriting of stored procedures for code enhancements.
- Developed new and maintained existing Informatica mappings and workflows based on specifications.
- Performed Informatica code migration across development, QA, and production, and fixed mapping and workflow problems.
- Implemented Performance tuning of existing stored procedures, functions, views & SQL Queries.
Environment: Informatica, Talend Platform for Data Management 5.6.1, UNIX scripting, TOAD, Oracle 10g
- Extensively used Informatica for extracting, transforming, and loading data from sources including Oracle, DB2, and flat files.
- Participated in user meetings and gathered requirements.
- Worked with business analysts, application developers, and production teams.
- Translated user inputs into ETL design documents.
- Participated in design reviews of the data model and Informatica mapping designs.
- Extensively used debugger for troubleshooting issues and checking session stats/logs.
- Performed validation and testing of Informatica mappings against the pre-defined ETL design standards.
- Implemented performance tuning at the database level; reduced load time by using partitions and running concurrent sessions.
- Involved in writing SQL, PL/SQL, stored procedures, triggers, and packages in data warehouse environments that employ Oracle.
- Developed interfaces using UNIX shell scripts to automate bulk loads and updates.
- Used shell scripts to automate exporting data into flat files for backup and deleting data from staging tables for a given time period.
- Extensive experience in performance tuning: identifying and resolving bottlenecks to improve performance at the database, Informatica mapping, and session levels.
- Involved in the optimization of SQL queries which resulted in substantial performance improvement for the conversion processes.
- Involved in the migration of objects through all phases (DEV, QA, and PRD) of the project, and trained developers to maintain the system in production.
- Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating Transformations.
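The staging export/purge automation above can be sketched as a generator for the SQL*Plus script that does the work. The table name, `load_date` column, and file-name pattern are illustrative assumptions; the generated file would normally be handed to `sqlplus -s user/pass @script.sql` rather than executed here.

```shell
#!/bin/sh
# archive_staging: write the SQL*Plus script that spools a staging table's
# rows for a date window to a flat file, then deletes the exported rows.
# Arguments: table name, window start, window end (YYYY-MM-DD), output path.
archive_staging() {
    table="$1"; from="$2"; to="$3"; out="$4"
    cat > "$out" <<EOF
SET HEADING OFF FEEDBACK OFF PAGESIZE 0
SPOOL ${table}_${from}_${to}.dat
SELECT * FROM ${table} WHERE load_date BETWEEN DATE '${from}' AND DATE '${to}';
SPOOL OFF
DELETE FROM ${table} WHERE load_date BETWEEN DATE '${from}' AND DATE '${to}';
COMMIT;
EOF
}
```

Generating the script instead of hard-coding it lets one cron entry archive several staging tables with different windows; the DELETE reuses the exact predicate of the SELECT so only the spooled rows are purged.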
Environment: Informatica, MS SQL Server, Oracle, Flat files, Erwin, UNIX Shell Scripting, TOAD, Autosys, Windows 2003 Server.