
Sr. ETL Talend Consultant Resume

Chicago, IL

SUMMARY

  • 7+ years of IT experience in the analysis, design, development, testing, and implementation of business application systems.
  • Experience developing and leading end-to-end implementations of Big Data projects using Talend Big Data, with comprehensive experience in the Hadoop ecosystem, including MapReduce, Hadoop Distributed File System (HDFS), and Hive.
  • Experience developing complex jobs with web services, including REST and SOAP APIs.
  • Extensive experience in ETL methodology for data profiling, data migration, extraction, transformation, and loading using Talend; designed data conversions from a wide variety of source systems, including Oracle, DB2, SQL Server, Teradata, Hive, HANA, flat files, XML, mainframe files, and ActiveMQ.
  • Good experience with relational database management systems and in integrating data from various sources such as Oracle, MS SQL Server, MySQL, and flat files.
  • Hands-on experience with the Hadoop technology stack (HDFS, MapReduce, Hive, HBase, Pig, Sqoop, Oozie, Flume, and Spark).
  • Experience with NoSQL databases such as HBase and Cassandra.
  • Excellent knowledge of the deployment process from DEV to QA, UAT, and PROD using both the Deployment Group and Import/Export methods.
  • Excellent working experience in Waterfall, Agile methodologies.
  • Familiar with design and implementation of the Data Warehouse life cycle and excellent knowledge on entity-relationship/multidimensional modeling (star schema, snowflake schema), Slowly Changing Dimensions (SCD Type 1, Type 2, and Type 3).
  • Debugged ETL job errors and performed ETL sanity checks and production deployment in the Talend Administration Console (TAC) using SVN.
  • Experience in troubleshooting and performance tuning at various levels, such as source, target, mapping, session, and system, in the ETL process.
  • Experience in converting stored procedure logic into ETL requirements.
  • Good communication and interpersonal skills, ability to learn quickly, with good analytical reasoning and adaptive to new and challenging technological environment.
  • Experience in Big Data technologies such as Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, DynamoDB, Elasticsearch, and Spark SQL.
  • Experienced in ETL methodology for data migration, data profiling, extraction, transformation, and loading using Talend; designed data conversions from a large variety of source systems, including Oracle, DB2, Netezza, SQL Server, Teradata, Hive, and HANA, and non-relational sources such as flat files, XML, and mainframe files.
  • Involved in code migrations from Dev to QA and Production, providing operational instructions for deployments.
  • Hands-on experience migrating DataStage 8.7 ETL processes to Talend Studio.
  • Strong data warehousing ETL experience using Informatica 9.x/8.x/7.x PowerCenter client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, covering project scope, analysis, requirements gathering, data modeling (Data Vault), effort estimation, ETL design, development, system testing, implementation, and production support.
  • Expertise in using transformations such as Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Java Transformation, Router, and Sequence Generator.
  • Experienced in writing Hive queries to load data into HDFS.
  • Extensive experience on Pentaho report designer, Pentaho kettle, Pentaho BI server, BIRT report designer
  • Experienced on designing ETL process using Informatica Tool to load from Sources to Targets through data Transformations.
  • Hands on experience in developing and monitoring SSIS/SSRS Packages and outstanding knowledge of high availability SQL Server solutions, including replication.
  • Hands on experience in Deployment of DTS and SSIS packages using ETL Tool.
  • Excellent experience on designing and developing of multi-layer Web Based information systems using Web Services including Java and JSP.
  • Strong experience in Dimensional Modeling using Star and Snow Flake Schema, Identifying Facts and Dimensions, Physical and logical data modeling using Data Vault, ERwin and ER-Studio.
  • Extensive experience in developing stored procedures, functions, views, triggers, and complex SQL queries.
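
The SCD Type 2 handling noted above can be sketched minimally in Python (illustrative only; the column names and in-memory table are hypothetical, since production implementations run inside Talend or Informatica mappings):

```python
from datetime import date

def apply_scd_type2(dimension, incoming, key, tracked, today=None):
    """Slowly Changing Dimension Type 2: when a tracked attribute changes,
    expire the current row and insert a new current version."""
    today = today or date.today().isoformat()
    for row in incoming:
        current = next((d for d in dimension
                        if d[key] == row[key] and d["is_current"]), None)
        if current is None:
            # New business key: insert as the first current version.
            dimension.append({**row, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif any(current[c] != row[c] for c in tracked):
            # Tracked attribute changed: close the old version, open a new one.
            current["valid_to"] = today
            current["is_current"] = False
            dimension.append({**row, "valid_from": today,
                              "valid_to": None, "is_current": True})
    return dimension

# Example: a customer moves from Chicago to McLean; history is preserved.
dim = []
apply_scd_type2(dim, [{"cust_id": 1, "city": "Chicago"}], "cust_id", ["city"])
apply_scd_type2(dim, [{"cust_id": 1, "city": "McLean"}], "cust_id", ["city"])
```

SCD Type 1 would instead overwrite the attribute in place, and Type 3 would keep only a single "previous value" column alongside the current one.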

TECHNICAL SKILLS

Development Tools: MS SQL Server (2005/2008/2008 R2/2012), Oracle 8i/9i/10g, DB2, PostgreSQL, MS Access, NoSQL databases (HBase).

Database Tools: MS SQL Server (2005/2008/2008 R2/2012), MS-Access, SQL Server Integration Services(SSIS), Data Transformation Services(DTS), SQL Server Reporting Services(SSRS)

Development skills: T-SQL, PL/SQL, SSIS/SSRS, SQL PLUS

Languages: VB.NET, UNIX/Linux shell scripting

ETL Tools: DTS, SSIS (SQL Server Integration Services), Informatica

Reporting Packages: SQL Server Reporting Services, MS Excel.

Tools/Methodologies: MS Project, SQL Profiler, TOAD, TFS 2010, JIRA, Agile, Waterfall.

Operating Systems: Windows XP, Windows 7, Windows 8, Windows Server 2003/2008/2008 R2

PROFESSIONAL EXPERIENCE

Confidential, Chicago, IL

Sr. ETL Talend Consultant

Responsibilities:

  • Collaborated with the Data Integration team on data and application integration, with the goal of moving data effectively, efficiently, and with high performance to support business-critical projects involving large data extractions.
  • Performed technical analysis, ETL design, development, testing, and deployment of IT solutions as needed by the business or IT.
  • Explored prebuilt ETL metadata and mappings; developed and maintained SQL code for Oracle as needed.
  • Worked on Oracle BI Publisher to extract data and transform it into flat files.
  • Created RTF and eText templates in Oracle BI Publisher to publish the data extracted from Oracle.
  • Create reusable Joblets and routines in Talend.
  • Worked on web services using Talend components such as tSOAP, tREST, tWebService, and tWebServiceInput.
  • Troubleshoot data integration issues and bugs, analyze reasons for failure, implement optimal solutions, and revise procedures and documentation as needed.
  • Built a platform for profiling and comparing data, used to make decisions on how to measure business rules and the quality of the data.
  • Responsible to tune ETL mappings, Workflows and underlying data model to optimize load and query Performance.
  • Implemented Performance tuning in Mappings and Sessions by identifying the bottlenecks and Implemented effective transformation logic.
  • Developed an ETL framework covering data masking, audit, balance, control, and validation architecture.
  • Worked on end-to-end development of software products from requirement analysis to system study, designing, coding, testing (Unit & Performance), documentation and implementation.
  • Responsible for code migrations from Dev. to QA and production and providing operational instructions for deployments.
  • Experienced in creating Generic schemas and creating Context Groups and Variables to run jobs against different environments like Dev, Test and Prod.
  • Provided the Production Support by running the jobs and fixing the bugs.
  • Utilized Big Data components such as tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHBaseInput, and tHBaseOutput.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading and unit tested the mappings.
  • Implemented Kafka consumption to read JSON messages from the provided Kafka topic.
  • Implemented the data integration and migration process with the Talend Big Data Integration Suite as part of the migration from SSIS to Talend, moving the destination database from SQL Server to AWS Redshift, and developed new applications in ETL.
  • Worked on ZENA job scheduling, server monitoring, task creation, plan creation, and job deployment.
  • Participated in and contributed to code reviews, shared modules, and reusable components.
  • Used JIRA for project tracking, project-related document sharing, and team collaboration.
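
The per-environment context groups mentioned above behave roughly like this Python sketch (the variable names and values are hypothetical; in Talend they live in Context Groups resolved at job launch, not in code):

```python
# Hypothetical context groups mirroring Talend's Dev/Test/Prod contexts.
CONTEXTS = {
    "Dev":  {"db_host": "dev-db.internal",  "db_schema": "stg", "batch_size": 100},
    "Test": {"db_host": "test-db.internal", "db_schema": "stg", "batch_size": 1000},
    "Prod": {"db_host": "prod-db.internal", "db_schema": "dw",  "batch_size": 10000},
}

def load_context(environment):
    """Return the context variables for one environment, the way Talend
    resolves a context group when a job is launched with --context=..."""
    try:
        return CONTEXTS[environment]
    except KeyError:
        raise ValueError(f"Unknown environment: {environment!r}")

ctx = load_context("Prod")
# The same job definition now points at the Prod database and schema.
```

The benefit is that one job artifact runs unchanged against Dev, Test, and Prod; only the selected context changes at deployment time.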

Skills Used: Talend Big Data 6.4.1, Talend MDM, Oracle BI, XML files, flat files, JSON, REST API, SOAP API, ZENA, EBX, Agile methodology.

Confidential, McLean, VA

Sr. ETL Talend Consultant

Responsibilities:

  • Created complex mappings in Talend 6.2 using tMap, tJoin, tReplicate, tParallelize, tFixedFlowInput, tAggregateRow, tFilterRow, tIterateToFlow, tFlowToIterate, tDie, tWarn, tLogCatcher, tHiveInput, tHiveOutput, etc.
  • Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to the data marts.
  • Designed ETL process using Talend Tool to load from Sources to Targets through data Transformations.
  • Developed Talend jobs to populate the claims data to data warehouse - star schema, snowflake schema, Hybrid Schema.
  • Integrated java code inside Talend studio by using components like tJavaRow, tJava, tJavaFlex and Routines.
  • Developed complex ETL jobs from various sources such as SQL Server, PostgreSQL, and other files, and loaded them into target databases using the Talend OS ETL tool.
  • Implemented File Transfer Protocol operations using Talend Studio to transfer files in between network folders.
  • Worked on analyzing the Hadoop cluster and different big data analytic tools, including Pig, the HBase database, and Sqoop.
  • Performed transformations, cleaning and filtering on imported data using Hive, MapReduce, and loaded final data into HDFS.
  • Handled importing of data from various data sources using Sqoop, performed transformations using Hive, MapReduce and loaded data into HDFS.
  • Excellent knowledge of NoSQL on MongoDB and Cassandra.
  • Developed jobs in Talend Enterprise edition from stage to source, intermediate, conversion and target.
  • Developed UNIX shell scripts for automating and streamlining existing manual procedures.
  • Migrated data from relational databases (Oracle, Teradata) and external data to HDFS using Sqoop, Flume, and Spark.
  • Developed Talend ETL jobs to push the data into Talend MDM and develop the jobs to extract the data from MDM.
  • Developed Talend ESB services and deployed them on ESB servers on different instances.
  • Developed data validation rules in Talend MDM to confirm the golden record.
  • Designed both Managed and External tables in Hive to optimize performance.
  • Created complex user provisioning jobs for company employees, run on a daily basis.
  • Experienced in working with the TAC (Talend Administration Console).
  • Developed error logging module to capture both system errors and logical errors that contains Email notification and moving files to error directories.
  • Handled the Errors at Control and Data flow level in SSIS Packages.
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings.
  • Developed mapping parameters and variables to support SQL override.
  • Created mapplets & reusable transformations to use them in different mappings.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Developed Talend jobs to load data into Hive tables and HDFS files, and to integrate the Hive tables with the Teradata system.
  • Worked on different tasks in workflows, such as sessions, event raise, event wait, decision, email, command, worklets, assignment, timer, and workflow scheduling.
  • Performed unit testing and code reviews, and promoted code to UAT and PROD.
  • Designed the Talend ETL flow to load the data into hive tables and create the Talend jobs to load the data into Oracle and Hive tables.
  • Migrated the code into QA (testing) and supported the QA team and UAT users.
  • Created detailed Unit Test Document with all possible Test cases/Scripts.
  • Worked with high volumes of data, tracking performance analysis of Talend job runs and sessions.
  • Conducted code reviews of my teammates' work before moving the code into QA.
  • Experienced in batch scripting on Windows, including Windows 32-bit commands, quoting, and escaping.
  • Used Talend reusable components like routines, context variable and globalMap variables.
  • Provided support to develop the entire warehouse architecture and plan the ETL process.
  • Knowledge on Teradata Utility scripts like FastLoad, MultiLoad to load data from various source systems to Teradata.
  • Modified existing mappings for enhancements of new business requirements.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Configured the Hive tables to load the profitability system into the Talend ETL repository and created the Hadoop connection for the HDFS cluster in the Talend ETL repository.
  • Worked as a fully contributing team member, under broad guidance, with independent planning and execution responsibilities.
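
The error-logging module described above (capture system and logical errors, notify, and move failed files to an error directory) can be sketched as follows (illustrative; the directory layout and notification hook are hypothetical, and in Talend this would be built from tLogCatcher, tSendMail, and file components):

```python
import shutil
from pathlib import Path

def process_files(inbox, error_dir, handler, notify=print):
    """Run `handler` on each file in `inbox`; on failure, record the error,
    send a notification, and move the file to `error_dir` for inspection."""
    error_dir = Path(error_dir)
    error_dir.mkdir(parents=True, exist_ok=True)
    failures = []
    for path in sorted(Path(inbox).glob("*")):
        try:
            handler(path)
        except Exception as exc:  # captures both system and logical errors
            failures.append((path.name, str(exc)))
            notify(f"Load failed for {path.name}: {exc}")  # email stub
            shutil.move(str(path), error_dir / path.name)
    return failures
```

Good files stay in place for downstream steps, while failed files are quarantined with their error recorded, so a rerun only has to deal with the quarantine directory.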

Skills Used: Talend Big Data 6.2, Talend MDM, Hive, Oracle 11g, AWS, XML files, flat files, HL7 files, JSON, Talend Administration Console, IMS, Agile methodology.

Confidential

SQL Developer

Responsibilities:

  • Created tables, temporary tables, table variables, indexes, procedures, views, and triggers for automating various tasks in T-SQL.
  • Performance tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Wizard and SQL Sentry plan explorer.
  • Worked on internal project to develop a performance Data warehouse.
  • Monitoring and Releasing reports in a timely manner to ensure the correct reports are reached to end users without data discrepancy.
  • Designed large number of reports using table filters, single value parameters, multi value parameters, dependent parameters and Cascading Parameters.
  • Participated in creating reports that deliver data based on stored procedures.
  • Performed Index analysis for tables and came up with more efficient solutions to use Clustered and Non-Clustered Indexes for significant performance boost.
  • Developed Multi-dimensional Objects using MS Analysis Services.
  • Developed SQL programs and was involved in optimizing stored procedures and queries for faster retrieval and server efficiency.
  • Designed and deployed various complex reports using MS Reporting Services.
  • Created Tabular reports, Cross Tab Reports, Matrix, Sub Reports and Parameterized reports.
  • Analyzed reports and fixed bugs in stored procedures.
  • Designed and implemented stored procedures and triggers for automating tasks.
  • Involved in Trouble Shooting Complex Reports.
  • Created a large number of report models for users to create their own reports.
  • Fine-tuned SQL queries for maximum efficiency and performance.
  • Configured the report viewing security for various users at various levels using Report Manager.
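
The index analysis described above can be illustrated with a minimal sketch (SQLite stands in for SQL Server here, and the table is hypothetical; on SQL Server the equivalent check is done with execution plans in SQL Profiler or Plan Explorer):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 50, i * 1.5) for i in range(1000)])

def plan(sql):
    """Return the query plan details, a cheap way to spot full table scans."""
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

before = plan("SELECT * FROM orders WHERE customer_id = 7")   # full scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan("SELECT * FROM orders WHERE customer_id = 7")    # index search
```

The plan text switches from a scan of the whole table to a search using the new index, which is the same before/after comparison an index tuning pass looks for.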

Environment: Microsoft SQL Server 2012, Microsoft Excel 2010, Power Pivot, Team Foundation Server (TFS), Windows 2008.

Confidential

SQL Developer

Responsibilities:

  • Analyzed business requirements and built logical data models describing all the data and the relationships between the data, using Data Vault.
  • Created new database objects like Procedures, Functions, Packages, Triggers, Indexes and Views using T-SQL in SQL Server 2005.
  • Validated change requests and made appropriate recommendations. Standardized the implementation of data.
  • Promoted database objects from test/develop to production. Coordinated and communicated production schedules within development team.
  • Created DTS packages through the ETL process for vendors, in which records were extracted from flat file and Excel sources and loaded daily to the server.
  • Involved in the development of a normalized database using Data Definition Language (DDL) in T-SQL.
  • Performed job scheduling in the SQL Server Environment.
  • Used Data Manipulation Language (DML) to insert and update data, satisfying referential integrity constraints and ACID properties.
  • Modified database structures as directed by developers for test/develop environments and assist with coding, design and performance tuning.
  • Managed user access. Created and managed new security accounts.
  • Backed up and restored databases.
  • Developed and implemented database and coding standards, involved in improving performance and maintainability of corporate databases.
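
The referential integrity enforcement mentioned above can be sketched minimally (SQLite stands in for SQL Server 2005 here, and the tables are hypothetical; the point is that the engine rejects DML that would orphan a child row):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),
        total REAL NOT NULL
    );
""")
conn.execute("INSERT INTO customers (id, name) VALUES (1, 'Acme')")
conn.execute("INSERT INTO orders (customer_id, total) VALUES (1, 19.99)")  # valid parent

rejected = None
try:
    # DML referencing a nonexistent parent violates referential integrity.
    conn.execute("INSERT INTO orders (customer_id, total) VALUES (99, 5.00)")
except sqlite3.IntegrityError as exc:
    rejected = str(exc)  # e.g. 'FOREIGN KEY constraint failed'
```

The failed statement is rolled back atomically, leaving only the valid order in place, which is the consistency half of the ACID guarantees referenced above.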

Environment: SQL Server 2000/2005/2008, T-SQL, DTS Designer, MS-Office, Ms-Excel, VSS
