Sr. ETL Lead/Talend Developer Resume

Charlotte, NC

SUMMARY

  • Over 9 years of diversified IT experience in all phases of analysis, design, development, implementation, administration, and support of ETL/Data Warehousing and Big Data applications.
  • Served in a variety of roles (Infrastructure Architect, Developer, Analyst, Admin, Production Support) across environments spanning ETL tools (Informatica Big Data Edition and Talend), databases (Oracle, SQL Server, Netezza, Teradata), Unix (RHEL, AIX), and scheduling tools (Autosys and Control-M).
  • Conducted proof-of-concept/pilot implementations of new and evolving software in the Hadoop ecosystem and provided recommendations.
  • Configured and administered lab environments with new installations of MapR/CDH/HDP and other open-source tools and third-party applications.
  • Experience in designing, developing, and implementing Extraction, Transformation, and Load (ETL) techniques on multiple database platforms and operating system environments.
  • Working experience on multi-node Hadoop cluster environments, handling all Hadoop environment builds, including design, capacity planning, and cluster setup, along with utilities like Hive, Hue, and Sqoop.
  • Experience in building and deploying Talend ETL solutions over Hadoop/Big data environment.
  • Expertise in developing, implementing and executing projects in Finance, Health care, Pharmacy and other Domains.
  • Worked on data integration, data modeling, data profiling, and data quality, with hands-on knowledge of specialized tools like Informatica Data Quality (IDQ) and Informatica Metadata Manager (IMM).
  • Knowledge of data and dimensional modeling, normalization, E-R diagrams, logical and physical design, and Star and Snowflake schemas using the Kimball and Inmon DW methodologies.
  • Proficient in gathering business requirements, establishing functional specifications and translating them to technical design specifications.
  • Proficient in data warehousing techniques for data cleansing and Slowly Changing Dimensions (SCD), and implemented performance-tuning measures at the mapping, session, database, and OS levels.
  • Experience in Integration of various Relational data sources like Oracle, Teradata, SQL Server and Mainframes and Flat Files into Staging Area.
  • Experience with Oracle 12c/11g and in writing complex SQL queries, PL/SQL stored procedures, functions, packages, triggers, and materialized views.
  • Expertise in developing and maintaining overall test methodology and strategy, documenting test plans and test cases, executing test cases and test scripts, and performing data validation through unit testing, system and integration testing, and UAT.
  • Experience implementing projects in Waterfall, Agile, and Kanban methodologies.
  • Experience using incident/change management tools like Remedy and ticketing tools like JIRA.
  • Knowledge of Java and scripting languages such as Unix shell.
  • Knowledge of reporting tools like MicroStrategy, OBIEE, and Business Objects, with working knowledge of Universes, reports, and ad-hoc reports.
  • Created UNIX scripts to automate workflows and used the Autosys/Control-M job schedulers to automate shell scripts and batch scheduling, as in the wrapper sketch after this list.
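
As a hedged illustration of the scheduler-driven automation in the last bullet, the sketch below shows a minimal Unix wrapper of the kind Autosys or Control-M would invoke; the Informatica domain, service, folder, and workflow names (DOM_DEV, INT_SVC, FOLDER_SALES, wf_load_sales) are hypothetical placeholders, not values from the engagements below.

    #!/bin/ksh
    # Scheduler-invoked wrapper: start an Informatica workflow via pmcmd
    # and propagate the exit status back to Autosys/Control-M.
    pmcmd startworkflow \
      -sv INT_SVC -d DOM_DEV \
      -u "$INFA_USER" -p "$INFA_PASS" \
      -f FOLDER_SALES -wait wf_load_sales
    rc=$?

    if [ $rc -ne 0 ]; then
      echo "wf_load_sales failed with exit code $rc" >&2
    fi
    exit $rc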

TECHNICAL SKILLS

Data Warehousing Tools: Talend Big Data Studio, Informatica BDE, Power Center 10.x/9.x, Power Exchange 9.x, IDQ, IMM-Metadata Manager

Data Modelling Tools: Erwin Data Modeler r7, MS Visio 2010

Databases: Oracle 12c, Teradata 15, IBM DB2, MS SQL Server 2014, Netezza, Hive, NoSQL, MySQL, HBase

Reporting Tools: MicroStrategy

Scheduling Tools: Autosys, Control-M

RDBMS Utilities: SQL*Plus, SQL*Loader, Toad, PL/SQL Developer, BTEQ, Netezza Workbench

Programming: UNIX Shell Scripting, PL/SQL, Perl, Python, Scala

Operating Systems: UNIX (RHEL, Solaris 10, AIX 7.1), Mainframes, Windows, Apache Hadoop and HDFS

PROFESSIONAL EXPERIENCE

Confidential, Charlotte, NC

Sr. ETL Lead/Talend Developer

Environment: Informatica 10.1 BDE, IDQ, Oracle 12c, RHEL 6.x, shell (Korn/bash) scripting, PAC2000, NDM, Autosys R11.3, SVN, Apache Hadoop, HDFS, CDH 5.0, MapR 5.0, Talend Integration Suite for Big Data 5.6

Responsibilities:

  • Developed a pilot implementation project on a 10-node MapR cluster to test Informatica BDM functionality for dynamic mappings, pushdown optimization using Hive, and reads/writes to Hive.
  • Administered a 10-node MapR lab cluster on RHEL, leveraging Informatica BDE and Talend from edge nodes.
  • Developed Big Data jobs leveraging HDFS/Hive for data ingestion from various sources on both the MapReduce and Spark frameworks, used Sqoop to import/export data between HDFS and relational databases, and developed Oozie workflows.
  • Ingested data into both internal (managed) and external Hive tables using both the Hive CLI and Beeline (see the Beeline sketch after this list).
  • Developed a proof of concept (POC) for Informatica connectivity to Splice Machine using ODBC/JDBC on Cloudera Hadoop distribution 5.0.
  • Developed a POC for Informatica Power Exchange for HDFS and Hive on a multi-node Cloudera Hadoop distribution.
  • Extensively used the Informatica Developer tool in BDE to create data ingestion jobs into the HDFS-based Hadoop data lake and to evaluate dynamic mapping capabilities.
  • Worked on Talend Big Data Studio and Talend Data Mapper to compare functionality and execution times with Informatica BDE 10.1.
  • Created jobs to load data to/from relational databases using both Sqoop import and Sqoop export, specifying mappers and split columns (see the Sqoop sketch after this list).
  • Provided tier-2 infrastructure and application technical support for the Information HUB, WISE Matching service, and the Metadata Manager Informatica and Oracle platforms as part of the IHUB Technical Support team.
  • Worked on Manta Flow in conjunction with IMM (Informatica Metadata Manager) to create data lineage diagrams, business glossaries, and metadata models for data governance and impact analysis.
  • Used Informatica Web Services to process SOAP requests and parse WSDL/XML data, and used Salesforce connectors to write to Force.com.
  • Worked on the full life-cycle implementation of real-time cloud applications using Salesforce (SFDC) and their integration with the Informatica Cloud solution.
  • Used Informatica Proactive Monitoring (RulePoint complex event processing and Real-Time Alert Manager) to monitor objects and events and deliver alerts.
  • Worked with application teams to enable data consumption, root-cause analysis, and performance analysis, in addition to troubleshooting and issue resolution, and provided technical consultation utilizing a thorough understanding of applicable technology, tools, and existing designs.
  • Triaged production support issues and questions impacting technical teams and the business community for the mission-critical Wholesale data distribution platform.
  • Assisted in resolving issues from technology teams and questions from new individuals seeking help determining data availability, driving root-cause investigations of data anomalies, system issues, and user-access requests.
  • Architected the technical refresh project to migrate the Informatica platform, Oracle databases, storage (NAS/SAN), and utility tools (Autosys, NDM) in conjunction with data center alignment.
  • Acted as the ETL liaison with operating system engineers (Unix) and NAS engineers in setting up NAS filers, including allocation, replication, retention, snapshots, and backups on both NetApp and Isilon platforms, and with network engineers in setting up F5 load balancers for applications like Oracle and NDM on top of VIP (local) and WIP (global) addresses.
  • Migrated/provisioned new users and groups, added servers to host groups and ACLs using BoKS, and set up single sign-on using Active Directory and LDAP authentication.
  • Configured NDM (Connect:Direct) node names using F5 load balancers, including netmaps and userfiles for both inbound and outbound file/data transmissions.
  • Designed and configured NAS sizing, backup, and replication aspects, and exported filers to server hosts.
  • Added alias/CNAME updates and IP reassignments for DNS hosts.
  • Remediated vulnerabilities through patching, as part of safety and soundness, on Linux and Windows servers.
  • Designed and implemented the BCP failover and failback strategy.
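
To make the Sqoop ingestion bullets above concrete, the following is a minimal sketch of the import/export pattern described, assuming a hypothetical Oracle source, placeholder table names (CLAIMS, CLAIMS_SUMMARY), and placeholder HDFS paths; actual connection strings, mapper counts, and split columns were project-specific.

    # Import an Oracle table into HDFS with 8 parallel mappers,
    # splitting the work on a numeric key column.
    sqoop import \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username etl_user -P \
      --table CLAIMS \
      --split-by CLAIM_ID \
      --num-mappers 8 \
      --target-dir /data/staging/claims

    # Export a transformed result set from HDFS back to the database.
    sqoop export \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username etl_user -P \
      --table CLAIMS_SUMMARY \
      --export-dir /data/curated/claims_summary \
      --num-mappers 4

Note that -P prompts for the password interactively; a scheduled job would typically use --password-file instead.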
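
The internal/external Hive table work mentioned above commonly pairs an external table over the raw landing directory with a managed table for curated data; the Beeline sketch below illustrates that pattern with hypothetical table names, columns, and paths, connecting to an assumed HiveServer2 endpoint.

    # Connect to HiveServer2 via Beeline (the JDBC URL is environment-specific).
    beeline -u jdbc:hive2://hiveserver:10000/staging -n etl_user -e "
      -- External table over the landing zone; dropping it leaves the files intact.
      CREATE EXTERNAL TABLE IF NOT EXISTS claims_raw (
        claim_id BIGINT, member_id STRING, claim_amt DECIMAL(12,2), svc_date STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
      LOCATION '/data/staging/claims';

      -- Managed (internal) table: Hive owns both the metadata and the data.
      CREATE TABLE IF NOT EXISTS claims_curated STORED AS ORC
      AS SELECT * FROM claims_raw WHERE claim_id IS NOT NULL;
    "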

Confidential, Birmingham, AL

Sr. ETL Lead Developer

Environment: Informatica PowerCenter 9.5.1, Power Exchange 9.5.1, Informatica Developer (IDQ and IDE), Metadata Manager (thin client), Data Masking, Talend Open Studio, Oracle 11g, Microsoft SQL Server 2008 R2, Toad for Oracle 12.1, IBM DB2, AIX UNIX 7.1, Red Hat Linux, shell (Korn/bash) scripting, Remedy, JIRA 6.2, Control-M.

Responsibilities:

  • Involved in gathering business requirements, establishing functional specifications and translating them to design specifications.
  • Applied dimensional-modeling knowledge to design and develop the star schema using Erwin.
  • Worked as an administrator creating users, groups, folders, categories, and connections in the repository and assigned access privileges.
  • Developed mappings using Informatica PowerCenter 9.5.1 and Power Exchange to load data from multiple sources and file systems, such as COBOL VSAM, flat files, and SEQ files, into Oracle tables.
  • Extensively used Informatica client tools like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, and Transformation Developer for defining source and target definitions, and coded the data flow from source systems to the data warehouse.
  • Developed PL/SQL procedures for creating/dropping indexes on target tables in pre-load and post-load strategies to improve session performance during bulk loads, for gathering statistics, and for archiving table data (a sketch of the pattern follows this list).
  • Created pre-session and post-session UNIX scripts for file delete, rename, archive, and zip tasks (see the example after this list).
  • Tested and validated the Informatica mappings against the pre-defined ETL design standards.
  • Implemented test plans and test cases for unit tests and peer reviews, fixed bugs, and validated mappings.
  • Provided production support, finalized workflow scheduling using the Control-M tool, deployed the workflows to the production repository and server, and supported them through automated shell scripts.
  • Used Jira for issue and project tracking for Kanban methodology and real time reporting.
  • Coordinated with the reporting team to develop OLAP reports using Business Objects and supported the creation of Universes, reports, and ad-hoc reports.
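
As a hedged sketch of the pre-load/post-load index strategy above, the shell fragment below drops an index before a bulk load and rebuilds it with fresh optimizer statistics afterward; the schema, table, index, and connection names are hypothetical, and in practice the logic lived in PL/SQL procedures called from Informatica pre- and post-load SQL.

    #!/bin/ksh
    # Pre-load: drop the index so the bulk insert skips index maintenance.
    sqlplus -s etl_user/"$DB_PASS"@ORCL <<'EOF'
    DROP INDEX dw.fact_sales_ix;
    EOF

    # ... the Informatica session performs the bulk load here ...

    # Post-load: recreate the index and refresh optimizer statistics.
    sqlplus -s etl_user/"$DB_PASS"@ORCL <<'EOF'
    CREATE INDEX dw.fact_sales_ix ON dw.fact_sales (sale_date) PARALLEL 4;
    EXEC DBMS_STATS.GATHER_TABLE_STATS('DW', 'FACT_SALES');
    EOF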
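
The pre-/post-session file handling mentioned above typically looks like the sketch below: a minimal post-session script that timestamps, compresses, and archives processed source files; the directory names, file pattern, and 30-day retention are placeholder assumptions.

    #!/bin/ksh
    # Post-session cleanup: archive processed source files with a timestamp.
    SRC_DIR=/data/inbound
    ARC_DIR=/data/archive
    STAMP=$(date +%Y%m%d%H%M%S)

    for f in "$SRC_DIR"/*.dat; do
      [ -e "$f" ] || continue                      # skip if no files matched
      mv "$f" "$ARC_DIR/$(basename "$f").$STAMP"   # rename with timestamp
      gzip "$ARC_DIR/$(basename "$f").$STAMP"      # compress the archived copy
    done

    # Purge archives older than 30 days.
    find "$ARC_DIR" -name '*.gz' -mtime +30 -exec rm -f {} \;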

Confidential, CA

Informatica Developer

Environment: Informatica PowerCenter 8.6.1, Toad 8.5, Oracle 9i, PL/SQL, Teradata V2R4, flat files, Solaris 9/10, Microsoft Visual SourceSafe 6.0, Windows PowerShell, Microsoft SQL Server 2005/2008, Windows Server 2003, shell (Korn/bash) scripting, Autosys.

Responsibilities:

  • Analyzed and provided data for the ETL framework for the Claims Processing Engine, a process designed and customized by external vendors for the client.
  • Comprehensively analyzed and systematically documented the end-to-end flow of the inbound and outbound claims files (837/835/834).
  • Worked on Informatica B2B Data Transformation, which supports transformations and mappings via XML for most healthcare industry standards, including HIPAA 275/277 and HL7.
  • Worked with NCPDP billing unit standards and HIPAA X12 5010 standards, including 837/835.
  • Developed mappings in Informatica to load data from various sources into the data warehouse using transformations such as Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter, and Source Qualifier.
  • Created complex Informatica mappings and reusable Mapplet objects depending on client requirements.
  • Used Workflow Manager for creating, validating, testing, and running sequential and concurrent batches and sessions, and scheduled them to run at specified times with the required frequency.
  • Used Autosys for automating batches and sessions (a hedged JIL sketch follows this list).
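
As a rough illustration of the Autosys automation in the last bullet, the sketch below defines a daily command job through the jil utility; the job name, machine, owner, script path, and schedule are hypothetical placeholders.

    # Define a daily Autosys command job that runs an ETL wrapper script.
    jil <<'EOF'
    insert_job: etl_claims_daily
    job_type: c
    machine: etlhost01
    owner: etladm
    command: /opt/etl/bin/run_claims_wf.sh
    start_times: "02:00"
    days_of_week: all
    std_out_file: /var/log/etl/etl_claims_daily.out
    std_err_file: /var/log/etl/etl_claims_daily.err
    EOF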
