Sr. ETL/Talend Developer Resume
PROFESSIONAL SUMMARY:
- Over 7 years of experience in the IT industry with a strong background in integration technologies, including iPaaS tools such as Dell Boomi, on-premise integration tools such as Oracle SOA Suite, ETL tools such as Talend, and database technologies such as Oracle, DB2, and SQL Server.
- Wide range of progressive experience in providing product specifications, design, analysis, development, documentation, coding, and implementation of business technology solutions for data warehousing applications.
- Extensive experience in the development and maintenance of a corporate-wide ETL solution using SQL, PL/SQL, and Talend 5.x/6.x/7.x on UNIX and Windows platforms.
- Strong knowledge of Azure API Management.
- Proficient ETL developer with experience in data warehousing, ETL maintenance, Master Data Management (MDM) strategy, data quality, and Big Data ecosystems.
- Strong experience with Talend Data Integration and Big Data tools, including Data Mapper, Joblets, metadata, and Talend components and jobs.
- Extensive experience integrating heterogeneous data sources such as SQL Server, Oracle, flat files, and Excel files, and loading the data into data warehouses and data marts using Talend Studio.
- Experience with databases such as MySQL and Oracle using AWS RDS.
- Experienced with Talend Data Fabric ETL components, including context variables and MySQL, Oracle, and Hive database components.
- Experience with scheduling tools such as Maestro, Rancher with Docker, Control-M, and Job Conductor (Talend Administration Center).
- Worked with different data formats such as flat files, database tables, JSON, XML, and CSV.
- Experience in orchestration and integration technologies in the areas of Web Services, SOAP, WSDL, XML, and XSD.
- Extensively created mappings in Talend using tMap, tJoin, tReplicate, tConvertType, tFlowMeter, tLogCatcher, tNormalize, tDenormalize, tJava, tAggregateRow, tWarn, tMysqlSCD, tFilter, tDie, the globalMap, etc.
- Excellent experience with NoSQL databases such as HBase and Cassandra.
- Excellent understanding of Hadoop architecture, the Hadoop Distributed File System (HDFS), and its APIs.
- Extensive knowledge of business processes and the functioning of the healthcare, manufacturing, mortgage, financial, retail, and insurance sectors.
- Strong skills in SQL and PL/SQL backend programming, creating database objects such as stored procedures, functions, cursors, triggers, and packages.
- Experience in AWS S3, EC2, SNS, and SQS setup, Lambda, RDS (MySQL), and Redshift cluster configuration.
- Experienced in Waterfall and Agile/Scrum development.
- Strong experience developing ETL processes using Informatica, including ETL control tables, error logging, auditing, and data quality.
- Good knowledge of implementing various data processing techniques using Pig and MapReduce to handle and format data as required.
- Extensively used ETL methodology for data migration, data profiling, extraction, transformation, and loading using Talend, and designed data conversions from a wide variety of source systems such as SQL Server, Oracle, and DB2, as well as non-relational sources such as XML, flat files, and mainframe files.
- Well versed in developing various database objects such as packages, stored procedures, functions, triggers, tables, indexes, constraints, and views in Oracle 11g/10g.
- Experienced in code migration, version control, scheduling tools, auditing, shared folders, and data cleansing in various ETL tools.
- Good communication and interpersonal skills, the ability to learn quickly, good analytical reasoning, and adaptability to new and challenging technological environments.
- Strong team spirit, relationship management, and presentation skills.
- Expertise in client-server application development using MS SQL Server …, Oracle …, PL/SQL, SQL*Plus, TOAD, and SQL*Loader. Worked with various source systems such as relational sources, flat files, XML, mainframe COBOL and VSAM files, and SAP sources/targets.
TECHNICAL SKILLS:
ETL/Middleware Tools: Talend 5.x/6.x/7.x, Dell Boomi, SSIS
Data Modeling: Dimensional Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, Fact and Dimension tables, Physical and Logical Data Modeling.
RDBMS: Oracle 11g/10g/9i, Netezza, Teradata, Redshift, MS SQL Server 2014/2008/2005/2000, DB2, MySQL, MS Access.
Programming Skills: Java, SQL, Oracle PL/SQL, Unix Shell Scripting, HTML, XML.
Tools: TOAD, SQL Plus, SQL*Loader, Quality Assurance, SoapUI, SharePoint, Ipswitch, Teradata SQL Assistant.
Operating Systems: Windows 8/7/XP/NT/2x, Unix-AIX, Sun Solaris 8.0/9.0.
PROFESSIONAL EXPERIENCE:
Confidential
Sr. ETL/Talend Developer
Responsibilities:
- Participated in requirement gathering, business analysis, and user meetings, and translated user inputs into ETL mapping documents.
- Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
- Involved in building the data ingestion architecture and source-to-target mappings to load data into the data warehouse.
- Worked with the data mapping team to understand the source-to-target mapping rules; analyzed the requirements, framed the business logic, and implemented it using Talend.
- Involved in ETL design and documentation.
- Developed Talend jobs from the mapping documents and loaded the data into the various target Data warehouses.
- Involved in end-to-end Testing of Talend jobs.
- Created a number of Java routines for complex transformations and utilities reused across integrations (a representative routine sketch follows this list).
- Experience migrating data from legacy systems such as Oracle, SugarCRM, and Netezza to cloud data warehouses and Salesforce CRM.
- Developed and designed SOAP/REST services using SOA/OSB to support data for user interfaces such as mobile and web applications.
- Worked with the business to collect requirements, design integrations, document specifications, and communicate and coordinate the development and testing effort. As part of the change management team, revisited and readdressed P2P policies and procedures impacted by these initiatives.
- Worked with multiple ICS adapters (File, FTP, Salesforce, ERP Cloud, HCM, Twilio, DB, SOAP & REST).
- Developed a migration utility to automatically migrate/deploy and activate interfaces from one Oracle ICS Cloud environment to another using CI/CD tools such as Bitbucket Pipelines and the Oracle ICS REST APIs.
- Performed EDI conversion (XML to EDI and EDI to XML) using the B2B adapter.
- Continuous Integration and Continuous Delivery (CI/CD), Dependency Injection (DI), Inversion of Control (IoC), GitHub, SourceTree, Bitbucket, Maven, Jenkins.
- Worked on cloud data storage such as Amazon S3 and Azure Data Lake.
- Worked with the Amazon S3 and EC2 components to migrate data from different source systems to Amazon S3 buckets.
- Gained knowledge of Azure services and achieved Azure certification.
- Wrote complex SQL queries to extract data from various sources and integrated them with Talend.
- Worked on context variables and defined contexts for database connections and file paths, allowing jobs to migrate easily between environments (an environment-switching sketch follows this list).
- Involved in loading data into Netezza from legacy systems and flat files using UNIX scripts; worked on performance tuning of Netezza queries with a proper understanding of joins and distribution.
- Created ETL job infrastructure using Talend Open Studio.
- Responsible for MDM of customer data using Talend MDM, covering customers, suppliers, products, assets, agencies, stores, address standardization, reference data, and employees; MDM is about creating and managing the golden records of the business.
- Developed the business rules for cleansing/validating/standardizing data using Informatica Data Quality.
- Developed standards for the ETL framework to ease reuse of similar logic across the board.
- Analyzed requirements, created designs, and delivered documented solutions adhering to the prescribed Agile development methodology and tools.
- Developed mappings to extract data from different sources, such as DB2 and XML files, and load it into the Data Mart.
- Created complex mappings using transformations such as Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the Data Mart.
- Responsible for developing jobs using ESB components such as tESBConsumer, tESBProviderFault, tESBProviderRequest, tESBProviderResponse, tRESTClient, tRESTRequest, and tRESTResponse to handle service calls for customer DUNS numbers.
- Created the Talend Development Standards document, which describes the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
- Troubleshot databases, Joblets, mappings, sources, and targets to find bottlenecks and improve performance.
- Rigorously involved in data cleansing and data validation to catch and correct corrupted data.
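The Java routines mentioned above were the kind of small, reusable helpers invoked from components such as tMap. Below is a minimal sketch of one such routine; the class name, method names, and cleansing rules are illustrative assumptions, not the actual project code.

```java
package routines; // Talend custom routines conventionally live in this package

public class StringSanitizer {

    // Null-safe trim that collapses internal whitespace; usable inside a
    // tMap expression, e.g. StringSanitizer.clean(row1.customerName).
    public static String clean(String value) {
        if (value == null) {
            return null;
        }
        return value.trim().replaceAll("\\s+", " ");
    }

    // Strips everything but digits so identifiers such as phone numbers
    // from heterogeneous sources compare consistently during matching.
    public static String digitsOnly(String value) {
        return value == null ? null : value.replaceAll("\\D", "");
    }
}
```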
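Environment switching itself was handled by Talend context groups rather than hand-written code; the sketch below only illustrates the underlying pattern of resolving connection details per environment instead of hard-coding them. The properties file name, keys, and connection details are hypothetical.

```java
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class ContextSwitchDemo {
    public static void main(String[] args) throws Exception {
        // Pick the context the same way a Talend job picks Default/Test/Prod.
        String env = args.length > 0 ? args[0] : "dev";
        Properties ctx = new Properties();
        try (FileInputStream in = new FileInputStream("context_" + env + ".properties")) {
            ctx.load(in); // expected keys: db_url, db_user, db_password (hypothetical)
        }
        // The JDBC driver matching db_url must be on the classpath.
        try (Connection con = DriverManager.getConnection(
                ctx.getProperty("db_url"),
                ctx.getProperty("db_user"),
                ctx.getProperty("db_password"))) {
            System.out.println("Connected using the '" + env + "' context");
        }
    }
}
```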
Environment: Talend 6.x, XML files, DB2, Oracle 11g, Netezza 4.2, SQL Server 2008, SQL, MS Excel, MS Access, UNIX shell scripts, Talend Administration Center, Cassandra, Oracle, Jira, SVN, Quality Center, Agile methodology, TOAD, Autosys.
Confidential
Sr. Talend / ETL Developer
Responsibilities:
- Worked on SSAS, creating data sources, data source views, named queries, calculated columns, cubes, dimensions, and roles, and deploying Analysis Services projects.
- Performed SSAS cube analysis using MS Excel and PowerPivot.
- Converted some of the existing integration jobs from SSIS and Informatica to Talend 6.3, and documented the process.
- Implemented SQL Server Analysis Services (SSAS) OLAP cubes with dimensional data modeling (star and snowflake schemas).
- Developed standards for the ETL framework to ease reuse of similar logic across the board.
- Analyzed requirements, created designs, and delivered documented solutions adhering to the prescribed Agile development methodology and tools.
- Responsible for creating fact, lookup, dimension, and staging tables, as well as other database objects such as views, stored procedures, functions, indexes, and constraints.
- Monitored data quality and generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced existing production ETL processes.
- Developed complex Talend ETL jobs to migrate the data from flat files to database.
- Implemented custom error handling in Talend jobs and worked on different methods of logging.
- Followed the organization-defined naming conventions for naming flat file structures, Talend jobs, and the daily batches that execute the Talend jobs.
- Responsible for developing jobs using ESB components such as tESBConsumer, tESBProviderFault, tESBProviderRequest, tESBProviderResponse, tRESTClient, tRESTRequest, and tRESTResponse to handle service calls for customer DUNS numbers.
- Exposure to ETL methodology supporting the data extraction, transformation, and loading process in a corporate-wide ETL solution using Talend Open Studio for Data Integration 5.6; worked on real-time Big Data integration projects leveraging Talend Data Integration components.
- Analyzed and performed data integration using Talend open integration suite.
- Wrote complex SQL queries to ingest data from various sources and integrated them with Talend.
- Worked on the Talend Administration Center (TAC) for scheduling jobs and adding users.
- Worked on context variables and defined contexts for database connections and file paths to migrate easily between environments in a project.
- Developed mappings to extract data from different sources, such as DB2 and XML files, and load it into the Data Mart.
- Created complex mappings using transformations such as Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression, and Aggregator to pipeline data to the Data Mart.
- Involved in designing logical/physical data models and reverse engineering the entire subject area across the schema.
- Developed Talend ESB services and deployed them on ESB servers on different instances.
- Created WSDL data services using Talend ESB.
- Created REST services using the tRESTRequest and tRESTResponse components.
- Used the tESBConsumer component to call a method from an invoked Web Service.
- Scheduled and automated ETL processes with the scheduling tools Autosys and TAC.
- Scheduled the workflows using Shell script.
- Used the most common Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput, and many more).
- Created many complex ETL jobs for data exchange to and from the database server and various other systems, including RDBMS, XML, CSV, and flat file structures.
- Developed a stored procedure to automate the testing process, easing the QA effort and reducing test timelines for data comparison on tables (a comparison sketch follows this list).
- Automated the SFTP process by exchanging SSH keys between UNIX servers (a key-based transfer sketch follows this list).
- Worked extensively on the Talend Administration Center and scheduled jobs in Job Conductor.
- Involved in production deployment activities, created the deployment guide for migrating the code to production, and prepared production run books.
- Created the Talend Development Standards document, which describes the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
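The QA automation above was built as a stored procedure; the Java sketch below shows the core idea it implemented, comparing row counts between a staging table and its target. The connection string, credentials, and table names are hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class TableCompare {

    // Counts rows in the given table; table names here come from a fixed,
    // trusted list, never from user input.
    private static long count(Connection con, String table) throws Exception {
        try (Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "etl_user", "secret")) {
            long src = count(con, "STG_CUSTOMER");
            long tgt = count(con, "DW_CUSTOMER");
            System.out.println(src == tgt
                    ? "PASS: counts match (" + src + ")"
                    : "FAIL: staging=" + src + ", target=" + tgt);
        }
    }
}
```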
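The SFTP automation relied on exchanged SSH keys so scripted transfers ran without passwords. As a hedged illustration, the sketch below performs the equivalent key-based transfer from Java using the open-source JSch library (a stand-in for the actual UNIX shell scripts); the host, user, and paths are hypothetical.

```java
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class SftpPush {
    public static void main(String[] args) throws Exception {
        JSch jsch = new JSch();
        jsch.addIdentity("/home/etl/.ssh/id_rsa");        // key-based auth, no password
        Session session = jsch.getSession("etl", "sftp.example.com", 22);
        session.setConfig("StrictHostKeyChecking", "no"); // relaxed for this demo only
        session.connect();

        ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
        sftp.connect();
        sftp.put("/data/out/daily_extract.csv", "/inbound/daily_extract.csv");
        sftp.disconnect();
        session.disconnect();
    }
}
```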
Environment: Talend 5.x/5.6, XML files, DB2, Oracle 11g, SQL Server 2008, SQL, MS Excel, MS Access, UNIX Shell Scripts, TOAD, Autosys.
Confidential
Software Developer
Responsibilities:
- Monitored log shipping/replication and troubleshot errors.
- Created linked servers between SQL Server 2000 and Oracle 9i.
- Wrote complex stored procedures, user-defined functions, and triggers using T-SQL (a JDBC invocation sketch follows this list).
- Created DTS packages for data transfer between the two environments.
- Resolved security issues related to logins, database users, application roles, and linked servers.
- Performed performance tuning of SQL queries and stored procedures using SQL Profiler and the Index Tuning Advisor.
- Administered all SQL Server database objects, logins, users, and permissions on each registered server.
- Resolved any deadlocks issues with Databases/Servers on a real-time basis.
- Wrote scripts for generating the daily backup report, verifying completion of all routine backups, monitoring log space utilization, etc.
- Performed backup and restoration of data/databases using a third-party tool (SQL LiteSpeed).
- Involved in Design and Development of Disaster Recovery Plan.
- Created reports using Crystal Reports.
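As a hedged illustration of the T-SQL work above, the sketch below shows one way a stored procedure like those described could be invoked from Java over JDBC; the procedure name, parameters, and connection details are hypothetical.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class CallBackupReport {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:sqlserver://dbhost:1433;databaseName=AdminDB", "report_user", "secret");
             CallableStatement cs = con.prepareCall("{call usp_DailyBackupReport(?, ?)}")) {
            cs.setString(1, "2004-01-15");              // report date (IN parameter)
            cs.registerOutParameter(2, Types.INTEGER);  // failed-backup count (OUT parameter)
            cs.execute();
            System.out.println("Failed backups: " + cs.getInt(2));
        }
    }
}
```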
Environment: SQL Server 2000 Enterprise Edition, Windows 2000/NT, UNIX, Excel, SQL Profiler, Replication, DTS, MS Access, T-SQL, Crystal Reports.