
ETL Developer & Team Lead Resume


PROFESSIONAL SUMMARY:

  • Techno-functional data warehousing consultant with 8 years of experience using ETL tools such as Informatica PowerCenter, Informatica Data Quality (IDQ), Informatica Cloud (IICS), and SSIS, and in database programming and management using MS SQL Server, T-SQL, Oracle, DB2 and Teradata.
  • Excellent Life Sciences and Health Care (LSHC) domain knowledge; certified healthcare professional.
  • Experience as a team lead, guiding the team to fulfill set objectives within the stipulated timelines.
  • Experience in Designing and developing data integrations using Informatica Cloud (IICS).
  • Implemented ETL best practices and aided in identifying, evaluating, and developing ETL solutions on IICS for the client as per business requirements.
  • Experience in understanding business requirements and converting them into high-level and low-level ETL design documents that guide the technical team in performing assigned tasks.
  • Developed and deployed Informatica cloud components by implementing exception handling and roll back strategies.
  • Experience in performing Data Profiling, Data Cleansing & Standardization, Address Validation, and Match & Merge using Informatica IDQ and determining the quality of data based on the data quality rules.
  • Good working knowledge and implementation experience in both Kimball & Inmon data warehousing models.
  • Good working knowledge of data modeling using dimensional data modeling, Star/Snowflake schemas, fact & dimension tables, and physical & logical data modeling.
  • Strong hands-on ETL experience creating and debugging complex Informatica mappings/transformations, handling sources and targets such as Oracle, Teradata, flat files and XML, file transfers via FTP/SFTP, and automation of the ETL process.
  • Experience in using transformations like aggregator, expression, lookup, joiner, normalizer, rank, router, sorter, union, update strategy etc.
  • Worked with Informatica Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Implemented Audit control, Data Balancing and DQ checks while performing ETL loads.
  • Extensive experience with Unix shell scripting, creating scripts to perform pre- and post-Informatica-mapping tasks such as encrypting a file and sending email notifications.
  • Experience with PL/SQL for database activities such as inserting, updating, deleting, and extracting required data from tables; conducted performance tuning and query optimization for database queries, SQL stored procedures and views.
  • Experience in creating functions, views and stored procedures.
  • Experience in designing, creating and scheduling workloads and tasks using Tivoli Workload Scheduler (TWS). Performed administration tasks such as scheduling jobs, troubleshooting errors, identifying and resolving issues, and rerunning jobs.
  • Good knowledge of the agile development methodology, with hands-on experience with tools like Jira.
  • Proficient in requirements analysis, impact assessment, data analysis, estimations, resource planning and design and development of ETL modules.
  • Experience in working with clients to create functional requirements and design documents, develop cost reduction strategies, implement processes to increase operational efficiencies, and design solutions for improving business intelligence, business analytics and information management capabilities.
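The pre/post-mapping shell tasks described above (encrypting a file, sending notifications) can be sketched in Python. This is a minimal illustration with hypothetical paths; a SHA-256 fingerprint stands in for the actual encryption step, and the returned summary stands in for the email notification body:

```python
import hashlib
import shutil
from pathlib import Path

def pre_session_task(src: str, staging_dir: str) -> dict:
    """Hypothetical pre-mapping step: fingerprint the inbound file and
    stage a copy before the Informatica session consumes it. The digest
    is a stand-in for the encryption done by the real shell scripts."""
    src_path = Path(src)
    digest = hashlib.sha256(src_path.read_bytes()).hexdigest()
    staged = Path(staging_dir)
    staged.mkdir(parents=True, exist_ok=True)
    target = staged / src_path.name
    shutil.copy2(src_path, target)
    # A post-session step would send this summary as a notification.
    return {"file": target.name, "sha256": digest}
```

In the real workflows these steps ran as pre/post-session commands attached to the Informatica session, not as standalone scripts.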

TECHNICAL SKILLS:

Data Warehousing: Data integration, discovery, migration, modeling (dimensional & snowflake) & solutions architecture.

Tools: Informatica PowerCenter, Informatica IDQ, SQL Server, Oracle 12c/11g, Teradata, Informatica Cloud (IICS), T-SQL, DB2 and Tivoli Workload Scheduler (TWS).

Databases: SQL Server, Oracle, Teradata, DB2, DynamoDB and RDS.

Languages: PL/SQL, Python and Unix shell scripting.

PROFESSIONAL EXPERIENCE:

Confidential

ETL Developer & Team Lead

Responsibilities:

  • Worked on Data Integration Services (DIS), ETL mappings, Data synchronization, Replication tasks and created required Linear task flows.
  • Worked on Data Integration Hub to set up publications and subscriptions to unify, govern and share trusted data in a self-service fashion.
  • Worked on Application Integration services to create processes that govern data loads using services such as Service (for web service calls), Assignment, Jump, Decision and Throw. Created the required service connectors and application connections.
  • Created processes that make service calls through APIs to interact with 3rd-party applications.
  • Data profiling of various source systems.
  • Designed and developed data architecture & testing approach for all interfaces and integrations.
  • Salesforce integration using Informatica Intelligent Cloud Services (IICS).
  • Performance tuning and enhancing of data loads with solid and flexible design.
  • Data migration and load into Salesforce using data loader.
  • Design, develop and test ETL processes per business requirements.
  • Load/extract data from Salesforce cloud applications and legacy applications like SQL Server.
  • Define and document complex technical designs/requirements for integrations and data warehousing.
  • Ensure designed systems are reliable, self-recovering and require minimal support.
  • Develop and adhere to established programming and change management processes and standards, including naming conventions, formatting and program structure.
  • Ability to work independently and collaboratively in the development, testing and implementation lifecycles.
  • Ensure best practices are adhered to and leveraged while using Informatica Cloud.
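The API service calls above can be illustrated with a small Python sketch. The URL and payload are hypothetical, and building the request is deliberately separated from sending it so retries and error handling can be layered on by the caller:

```python
import json
from urllib import request

def build_service_call(base_url: str, resource: str, payload: dict) -> request.Request:
    """Assemble a POST request the way an Application Integration
    'Service' step calls a 3rd-party REST API. Actually sending it
    (and any retry/backoff policy) is left to the caller."""
    return request.Request(
        url=f"{base_url.rstrip('/')}/{resource}",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

A caller would pass the built request to `urllib.request.urlopen` inside its own error-handling wrapper.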

ENVIRONMENT: Informatica Intelligent Cloud Services (IICS), Linux, Salesforce, Oracle, Vlocity, SQL Server

Confidential

ETL & DQ Team Lead

Responsibilities:

  • Created an Enterprise Data Quality Architecture that can be leveraged across applications to measure the data quality of attributes from multiple data sources and generate a data quality dashboard.
  • Led the TECH POD Data Assessment Team (DAT), responsible for creating solutions leveraged by other applications to validate data quality.
  • Created Metadata mappings, workflows and applications which are used to populate Metadata across applications.
  • Created mapplets, complex data objects (CDOs) and logical data objects (LDOs), join analysis, Standardizer and Address Doctor components that validate rules against attributes of different objects and capture data quality as PASS or FAIL.
  • Created mappings to populate Application Summary table and Enterprise Summary table which contains data quality metrics at specific application level and Enterprise level respectively.
  • Created and scheduled TWS jobs and Job Streams which will measure data quality across applications on a weekly/monthly basis and populate data into Summary tables, which will be used for generating DQ dashboard.
  • Created an Audit Framework that captures a summary of each scheduled run, including run status, any failed records with reasons for failure, and start and end times, which Operate teams can leverage to identify and resolve issues during scheduled runs.
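The rules-vs-attributes validation and audit capture described above can be sketched as follows. The rule predicates, attribute names and summary fields are illustrative stand-ins, not the production IDQ objects:

```python
from datetime import datetime

def run_rule(attribute: str, values: list, predicate) -> dict:
    """Apply one data-quality rule to an attribute's values and record
    the outcome as PASS/FAIL, mirroring the rule checks above."""
    failures = [v for v in values if not predicate(v)]
    return {
        "attribute": attribute,
        "status": "PASS" if not failures else "FAIL",
        "failure_count": len(failures),
    }

def audit_record(run_results: list, start: datetime, end: datetime) -> dict:
    """Summary row for the Audit Framework: overall status, failing
    attributes, and run timings for the Operate team."""
    failed = [r["attribute"] for r in run_results if r["status"] == "FAIL"]
    return {
        "status": "FAIL" if failed else "PASS",
        "failed_attributes": failed,
        "start_time": start.isoformat(),
        "end_time": end.isoformat(),
    }
```

In production the per-rule results fed the Application Summary table and the rolled-up record fed the Enterprise Summary table.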

ENVIRONMENT: Informatica Data Quality (IDQ), MS SQL Server, Unix & Shell Scripting

Confidential

SQL Developer

Responsibilities:

  • Created data lineage information for all source and target tables involved in the Add-on Services assignment.
  • Identified and documented data lineage for GDPR-specific tables involving customer personal data.
  • Involved in prioritizing all 44 scripts, comprising more than 100k lines of Teradata code, with the help of SMEs.
  • Assigned the workload to both onshore and offshore resources.
  • Proactively set up calls with onshore teams for clarifications on the identified business rules.
  • Identified and documented all possible business rules and set up calls with SMEs to freeze the rules.
  • Created data flow diagrams for each add-on service, separately for lite and classic customers.
  • Created a gap analysis report comparing the rules identified in the current run with the already existing rules.
  • Created a business rules document based on the analysis, which SMEs leverage to understand anomalies in the rules and correct them.
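The gap analysis between current-run rules and existing rules reduces to set comparisons; a minimal sketch with hypothetical rule identifiers:

```python
def gap_analysis(current_rules, existing_rules):
    """Compare rules identified in the current run against the already
    documented rules: what is new, what disappeared, what is unchanged."""
    current, existing = set(current_rules), set(existing_rules)
    return {
        "new": sorted(current - existing),
        "retired": sorted(existing - current),
        "unchanged": sorted(current & existing),
    }
```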

ENVIRONMENT: SQL Server 2012, Teradata 13.0 & Oracle 12c

Confidential

ETL Developer

Responsibilities:

  • Processed the daily data feed sent by the client.
  • Generated reports for users to view price discrepancies between the customer price and the P&G price, and data mismatches between P&G and the client.
  • Generated an Upcoming Price Change report showing products with a price change in the coming 6 weeks, using Informatica mappings and SQL procedures.
  • Created procedures that allow users to create new items or modify existing items from the Salesforce application.
  • Created mappings, mapplets and workflows to perform the ETL activity from the source system and load data into the SQL Server database.
  • Worked with transformations such as union, sorter, expression, aggregator, joiner and rank to meet ETL demands and objectives.
  • Implemented performance tuning techniques at the mapping level to reduce the total run time of mappings and workflows.
  • Debugged and resolved issues at both the Informatica and database levels.
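The Upcoming Price Change report above selects products whose change takes effect within the next 6 weeks; a sketch of that window logic in Python, with hypothetical field names standing in for the actual mapping/procedure columns:

```python
from datetime import date, timedelta

def upcoming_price_changes(items: list, today: date) -> list:
    """Return SKUs whose price change takes effect within the next
    6 weeks, mirroring the report's selection logic."""
    horizon = today + timedelta(weeks=6)
    return [i["sku"] for i in items
            if today <= i["effective_date"] <= horizon]
```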

ENVIRONMENT: Informatica PowerCenter 9.5, MS SQL Server 2008/2012 & Unix

Confidential

SQL & ETL Developer

Responsibilities:

  • Wrote complex stored procedures to implement intricate business functionality.
  • Implemented multiple PCRs that required substantially redesigning the database architecture.
  • Achieved significant performance improvements using performance tuning and index optimization techniques.
  • Created SSIS jobs that run every 30 minutes to pull data from an Oracle database, used to track inventory shipped by vendors.
  • Created triggers, backup and maintenance jobs to maintain database health.
  • Created views and functions to implement business functionality.
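The recurring 30-minute pull above is typically an incremental (watermark-based) extract; a Python sketch of that pattern with hypothetical row fields, shown here only to illustrate the design, since the real job ran in SSIS:

```python
def incremental_pull(rows: list, watermark):
    """Select rows modified since the last run and advance the
    watermark, the pattern behind a recurring incremental extract."""
    changed = [r for r in rows if r["modified"] > watermark]
    new_watermark = max((r["modified"] for r in changed), default=watermark)
    return changed, new_watermark
```

Persisting the returned watermark between runs is what keeps each 30-minute pull limited to new or changed inventory rows.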

ENVIRONMENT: Informatica PowerCenter 8.6, MS SQL Server 2008/2012, Unix & SSIS
