Squad Lead - Talend Consultant Resume
Cincinnati, OH
SUMMARY:
- Software consultant with around 12 years of experience, mainly in Business Intelligence, Data Warehousing, and Big Data assignments using Talend, Teradata, Informatica, Oracle, HAWQ, and Greenplum.
- Played the roles of Systems Analyst, Architect, and ETL Developer over the course of my career.
- Currently working as a Talend Consultant for Confidential, building Data Lakes.
- Worked for Cisco Systems Inc., managing and delivering Data Warehouse projects using Teradata and Informatica PowerCenter.
- Experience working with Data Lakes on HAWQ and Greenplum (HDFS), plus conceptual knowledge of Hive, Python, AWS, and MongoDB acquired during project execution.
- Responsible for designing and implementing complex, highly scalable solutions that meet client requirements. Proficient in analyzing and translating business requirements into technical requirements and architecture.
- Experienced with Teradata ETL utilities such as FastLoad, MultiLoad, TPump, and FastExport. Also worked extensively with Teradata query submission and processing tools such as BTEQ and Teradata SQL Assistant.
- Hands-on experience in designing and developing Informatica ETL jobs (pushdown optimization techniques, performance tuning).
- Strong experience in troubleshooting techniques, SQL statement tuning, query optimization, and join indexes.
- Basic working knowledge of and experience with UNIX.
- Worked extensively in the Finance and Aviation domains.
- Pursuing a Diploma in Data Science.
TECHNICAL EXPERTISE:
Big Data Tools: Talend 6.x, HAWQ, Greenplum
ETL Tools: Informatica PowerCenter 9.x
RDBMS: Teradata 14, Oracle
TD Utilities: Teradata SQL Assistant, BTEQ, TPump, FastLoad, FastExport, MultiLoad
Scheduling Tools: $Universe, TAC
Languages: SQL, PL/SQL, C, C++, Python.
Operating Systems: Windows and Unix
PROFESSIONAL EXPERIENCE:
SQUAD LEAD - TALEND CONSULTANT
Confidential, Cincinnati, OH
Environment: Talend 6.x, HAWQ, Greenplum, PostgreSQL, MongoDB, Python, Unix.
Responsibilities:
- Preparation of technical and business documentation.
- Involved in setting up the Big Data environment for the Confidential Data Lake from scratch.
- Connected various source systems to the HDFS file system and delivered quality data to end customers.
- Involved in source analysis, job design, and development using Talend Open Studio.
- Communicated frequently with the GE team leader on design and strategy.
- Involved in migrating large volumes of data from legacy systems to the Big Data Lake using Talend.
- Involved primarily in METAR and environmental data integrations.
- Created new Talend jobs and associated tasks using various Talend components and Python scripts during data ingestion to the Data Lake (a sketch of this pattern follows this list).
- Executed and scheduled new and existing Talend jobs using TAC.
- Supported Talend jobs, troubleshooting and fixing issues as they arose.
- Involved in requirement gathering, business analysis, and user meetings, discussing issues to be resolved and translating user input into meaningful outcomes.
- Built best practices and guidelines for Talend data integration approaches required for source system integrations into the Greenplum/HAWQ Data Lake.
- Involved in data migration from Teradata to the Greenplum Data Lake.
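Purely as an illustration of the ingestion pattern described above, and not the actual project code, a minimal Python sketch of loading a staged extract into a Greenplum/HAWQ landing table via COPY might look like the following; the host, credentials, file path, and table name are hypothetical placeholders.

    import psycopg2

    # Hypothetical connection to a Greenplum/HAWQ master host.
    conn = psycopg2.connect(host="gp-master.example.com", port=5432,
                            dbname="datalake", user="etl_user", password="***")

    with conn, conn.cursor() as cur, open("/staging/metar_extract.csv") as src:
        # Bulk-load the staged CSV extract into a landing table using COPY.
        cur.copy_expert(
            "COPY landing.metar_observations FROM STDIN WITH CSV HEADER",
            src,
        )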
Confidential
Systems Analyst
Responsibilities:
- Requirements analysis, data assessment, business process reengineering.
- Involved in complete SDLC (System Development Life Cycle).
- Researched Sources and identified necessary Business Components for Analysis.
- Interacted with business users and business analysts to identify and develop business requirements, transformed them into technical requirements, and was ultimately responsible for delivering the solution.
- Actively involved in the analysis, design, and development of the project.
- Tuned SQL statements, optimized queries, and analyzed query performance issues in Teradata.
- Proficient in Teradata EXPLAIN plans, the COLLECT STATISTICS option, Secondary Indexes (USI, NUSI), Partitioned Primary Indexes (PPI), and volatile, global temporary, and derived tables.
- Used load utilities (BTEQ, MultiLoad, FastLoad) to load data into Teradata.
- Designed and developed Informatica mappings, sessions, and workflows based on business user requirements and business rules to load data from source flat files and RDBMS tables into target tables.
- Developed and scheduled workflows using Workflow Designer in Workflow Manager and monitored the results in Workflow Monitor.
- Designed, developed, and documented ETL processes using Informatica for data extraction and transformation.
- Created mappings, applied transformations to the tables, and created sessions to run the processes in Informatica.
- Worked on $U design and development, Kintana, and PVCS.
- Used Dollar Universe as the scheduling tool, calling Informatica workflows through Unix shell scripts (see the sketch after this list).
- Provided QA and deployment support.
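The $U jobs in this engagement launched Informatica workflows through Unix shell scripts; purely as a rough sketch (written in Python for consistency with the other examples, with placeholder service, domain, folder, and workflow names), such a wrapper might invoke pmcmd as follows.

    import subprocess

    # Hypothetical pmcmd call to start an Informatica workflow and wait for completion;
    # credentials would normally be read from the environment, not hard-coded.
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC_DEV",      # integration service (placeholder)
        "-d", "Domain_Dev",        # Informatica domain (placeholder)
        "-u", "etl_user", "-p", "***",
        "-f", "FIN_GL",            # repository folder (placeholder)
        "-wait",
        "wf_load_gl_daily",        # workflow name (placeholder)
    ]
    result = subprocess.run(cmd)
    raise SystemExit(result.returncode)  # propagate the exit status back to the scheduler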
Confidential
Technical Lead
Environment: Teradata, Informatica, Oracle, $Universe, TD Manager, Kintana, Unix, BTEQ.
Responsibilities:
- Understood client requirements and prepared understanding and design documents.
- Managed the offshore team, distributing deliverables from onsite to offshore and helping the team with any clarifications.
- Involved in analyzing the total impact of bringing the change into the Bookings measure and other Revenue Measure tables.
- Interacted with the onsite counterpart and planned activities.
- Documented and uploaded the various artifacts required for process-related audits.
Confidential
Technical Lead
Environment: Teradata, Informatica, Oracle, $Universe, Kintana, Unix, BTEQ.
Responsibilities:
- Planned offshore activities, timelines, daily deliverables, etc.
- Interacted with the client regarding requirements and attended LDM/PDM review meetings.
- Analyzed and understood functional, data, and other testing requirements.
- Involved in source system analysis and code reviews for production fixes.
- Supported the IT and business teams with User/Business Acceptance Testing.
Confidential
Technical Lead
Environment: Teradata, Informatica, Oracle, $Universe, Kintana, Unix, BTEQ.
Responsibilities:
- Developed Informatica mappings and wrote queries and BTEQ scripts in Teradata.
- Involved in gap analysis for the General Ledger Finance track, associated with different subject areas in Finance.
- Optimized Informatica mappings and SQL scripts in Teradata for better performance.
- Involved in various releases, from Release 1 through Release 12, covering development, maintenance, and support.
- Prepared Code Migration Documents (CMD) for different releases and supported deployment of code to test and production environments.
- Created design and other related documents.
- Primarily involved in scheduling the ETL jobs using $U for different job groups.
Confidential
Sr. ETL Developer
Environment: Informatica, Oracle, SQL Server, Unix
Responsibilities:
- Understood the existing business model and client requirements.
- Extensively involved in data extraction, transformation, and loading (ETL) from source to target systems using PowerCenter.
- Extracted source data from a legacy system of flat files into an Oracle staging area, then loaded it into Oracle targets through the ETL process, applying various business rules, mappings, mapplets, and transformations (a sketch of this pattern follows this list).
- Played an active role in performance tuning in Informatica.
- Used Source Analyzer and Warehouse Designer to import source and target database schemas and Mapping Designer to map data from sources to targets.
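The staging loads above were built as Informatica mappings; purely to illustrate the flat-file-to-staging pattern rather than the project's implementation, a minimal Python sketch with hypothetical file, column, table, and connection details might be:

    import csv
    import cx_Oracle

    # Hypothetical Oracle staging connection and target table.
    conn = cx_Oracle.connect("stg_user", "***", "ora-host.example.com:1521/STGDB")

    with open("/data/legacy/customers.dat", newline="") as f, conn.cursor() as cur:
        rows = [(r["cust_id"], r["cust_name"], r["region"]) for r in csv.DictReader(f)]
        # Bulk-insert the extracted records into the staging table.
        cur.executemany(
            "INSERT INTO stg_customers (cust_id, cust_name, region) VALUES (:1, :2, :3)",
            rows,
        )
    conn.commit()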