Team Lead Resume
Chattanooga, TN
SUMMARY
- An IT professional with 12+ years of experience in Informatica PowerCenter, Informatica Data Quality (IDQ), Informatica DVO, Informatica TDM, Business Objects Universe Designer, Business Objects Web Intelligence, Business Objects Desktop Intelligence, Informatica and Business Objects administration on Linux, HP-UX, and Windows platforms, Oracle, Teradata, Oracle BAM dashboards, SQL Server, Data Analysis, Python, and UNIX Shell Scripting.
- Technical proficiency in the field of Data Warehousing paired with Data Analysis, Data Modeling, Business Requirements Analysis, Application Design, Performance Remediation, ETL Architecture, Development & QA, Informatica Support and Administration, and Business Objects Support and Administration.
- 2+ years of work experience in Big Data/Hadoop development and ecosystem analytics using Python.
- Experience in analyzing data using HiveQL, Impala, and Spark SQL, and writing custom functions in Python (a brief sketch follows this summary).
- Extensive experience as a Teradata ETL Developer, Teradata ETL Migration Specialist, and Performance Tuning Expert.
- Extensively worked in an Agile SDLC.
- Expertise in Big Data architecture with the Hadoop file system and its ecosystem tools: MapReduce, HBase, Hive, Pig, Zookeeper, Oozie, Flume, Impala, Apache Spark, Spark Streaming, and Spark SQL.
- Hands-on experience in Apache Sqoop, Apache Storm, and Apache Hive integration.
- Hands-on experience working with different file formats such as text, JSON, Avro, Parquet, and ORC for Hive querying and processing.
- Expertise in migrating data from different databases (e.g., Teradata, Oracle, DB2) to HDFS.
- Extensively worked on extracting, transforming, and loading data from various sources such as Oracle, SQL Server, XML, flat files, JMS, and other applications.
- Responsible for all activities related to the design, development, implementation, and support of ETL processes for large-scale data warehouses using Informatica PowerCenter.
- Extensively worked on report development, universe development, and administration of Business Objects Enterprise.
- Strong experience in Data Warehousing and ETL using Informatica PowerCenter and reporting using Business Objects.
- Strong skills in Data Analysis, Data Requirement Analysis, Data Modeling, and design and development of ETL processes.
- Hands-on experience in tuning mappings and identifying and resolving performance bottlenecks at various levels, including sources, targets, mappings, and sessions.
- Extensive experience in ETL design, development, and maintenance using Oracle SQL, PL/SQL, and Informatica PowerCenter.
- Experienced in using the IDQ tool for profiling, applying rules, and developing mappings to move data from source to target systems.
- Experienced in writing unit test cases in Informatica DVO; automated all table-to-table validation scenarios in DVO.
- Applied masking rules for test databases using TDM.
- Provided Informatica and Business Objects production support in a 24/7 environment; monitored nightly jobs and reports and resolved issues on time.
- Extensive experience in report development using Webi Rich Client, universe development using Universe Designer, and Business Objects Enterprise administration using the CMC.
- Well versed in developing complex SQL queries, unions, and multiple-table joins, with experience in views and materialized views.
- Experience in database programming in PL/SQL (stored procedures, functions, triggers, and packages) and MS SQL Server 2008.
- Well versed in UNIX shell scripting.
- Worked extensively on Teradata Utilities like MLOAD, FASTLOAD, TPUMP, FASTEXPORT, BTEQ, TPT and ARCMAIN.
- Experience with Toad, SQL Developer, and PL/SQL Developer as interfaces to databases for analyzing and altering data.
- Experienced in creating effective test data and developing unit test cases to ensure successful execution of ETL processes.
- Excellent communication, documentation, and presentation skills using tools like Visio and PowerPoint.
- Closely worked with business users at various stages of the project life cycle, including requirements gathering, design, development, testing and post-production support.
- Healthcare domain exposure of over 7 years and banking domain exposure of over 4 years.
- Good communication and presentation skills, strong work-management and organizational skills; a self-motivated, hardworking team player who is intellectually flexible, a quick learner with a short learning curve, and quick to adapt.
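A minimal sketch of the custom Python functions mentioned above, registered for use in Spark SQL/HiveQL-style queries. This assumes Spark 1.x with Hive support; the schema, table, and column names are hypothetical placeholders, for illustration only.

    from pyspark import SparkContext
    from pyspark.sql import HiveContext
    from pyspark.sql.types import StringType

    sc = SparkContext(appName="python-udf-sketch")
    hive_ctx = HiveContext(sc)

    def normalize_phone(raw):
        # Keep digits only so phone numbers compare consistently.
        return "".join(ch for ch in raw if ch.isdigit()) if raw else None

    # Register the Python function so SQL/HiveQL queries can call it
    # (Spark 1.x API).
    hive_ctx.registerFunction("normalize_phone", normalize_phone, StringType())

    # Use the custom function in a Spark SQL query over a Hive table
    # (staging.members and its columns are hypothetical).
    hive_ctx.sql(
        "SELECT member_id, normalize_phone(phone) AS phone_clean "
        "FROM staging.members"
    ).show(5)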
TECHNICAL EXPERIENCE
Big Data Technologies: HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Oozie, Storm, Zookeeper, Kafka, Impala, HCatalog, Apache Spark, Spark Streaming, Spark SQL, RDDs, DataFrames, Python, HBase, Cassandra, Beeline
Hadoop Distributions: AWS, Cloudera (CDH3/CDH4/CDH5)
Data Warehousing Tools: Informatica PowerCenter 10.1/9.6/8.1, Business Objects XI 4.1/3.0/R2/5.1.6, IDQ 9.6.1, DVO 9.6, TDM 9.6, Talend 6.1, Oracle BAM dashboards, Xcelsius
Programming Languages: SQL, PL/SQL, Python, UNIX Shell Scripting, Java, XML, XSD
Database: Oracle 10g, SQL Server 2000/2008, Teradata V2R12, HBase, Cassandra
Operating Systems: Windows XP, Windows 7, Solaris 5.9, Linux 2.6, HP-UX 11i
Tools/Utilities: Toad, SQL Developer, PL/SQL Developer, PuTTY, WinSCP, HP Quality Center, HP Service Manager, Borland StarTeam, IBM uDeploy, Subversion, TortoiseGit, XML Pad, Rally
Scheduling Tools: ESP, Autosys
EXPERIENCE SUMMARY
Confidential, Chattanooga, TN
Team Lead
Responsibilities:
- Provided ETL solutions and technical leadership and direction to the Gateway reporting team by understanding business processes, gathering requirements, identifying potential usability issues, managing scope, and ensuring that an appropriate level of application quality was maintained at all times.
- Participated in requirement gathering with BAs and wrote technical user stories in Rally.
- Integrated data from relational databases through PySpark, extracting data from source systems and loading it into HDFS (see the PySpark sketch after this list).
- Worked with RDDs and DataFrames to process data in Spark.
- Accessed Hive tables in Spark through the Hive context and loaded them into DataFrames.
- Created partitions and bucketing on Hive tables to improve performance.
- Extensively used the Parquet format with Snappy compression to save storage and improve performance in HDFS.
- Designed Spark, Sqoop, Hive, and Impala workflow execution using the Oozie scheduler.
- Worked with Spark SQL to query data from Hive tables.
- Extensively used Sqoop to extract data from legacy systems (Teradata, Oracle) and load it into HDFS.
- Worked on migrating Spark scripts, Oozie workflows, and Hive and Impala scripts.
- Implemented slowly changing dimension tables in HDFS.
- Devised schemes to collect and stage large data volumes in HDFS and compressed the data using various formats to achieve optimal storage capacity.
- Experience working with different file formats such as Parquet and Avro.
- Performed performance tuning by creating partitions and bucketing in Hive.
- Developed Spark code using Python and Spark SQL for faster testing and data processing.
- Used Spark SQL to process large volumes of structured data.
- Worked on design, development, and enhancement of Informatica mappings and workflows to load data into staging, operational data store, and reporting schemas in Oracle and Teradata databases.
- Extensively used transformations such as Router, Aggregator, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, XML Parser, and Stored Procedure.
- Developed code following Agile practices.
- Identified performance bottlenecks using Informatica log files and the verbose option, then improved performance using Informatica partitioning, sorted input, and tuning of source SQL queries.
- Parameterized mappings and increased reusability.
- Enhanced UNIX shell scripts and PL/SQL procedures and functions.
- Modified Java code to parse XML files against XSDs and convert XML files to .dat files.
- Scheduled Informatica jobs to automate the ETL process.
- Applied rules and profiled source and target table data using IDQ.
- Wrote unit test cases in Informatica DVO and automated several technical unit test scenarios in DVO.
- Applied masking rules to several test databases using Informatica TDM.
- Used various transformations such as Address Validator, Parser, Joiner, Filter, and Matching to develop mappings.
- Performed reverse engineering of Business Objects reports, Informatica code, UNIX shell scripts, and Java code to understand existing processes, data, and relationships.
- Deployed and maintained code in the production environment.
- Monitored ETL session runs and intake ODS notification alerts for failures.
- Logged failures as HPSM tickets and analyzed incidents for root cause.
- Analyzed incorrect or missing data in reports.
- Prepared root cause analysis (RCA) documents for issues logged by end users through HPSM incidents.
- Prepared BTEQ and FastLoad scripts to load data in bulk mode to the staging area.
- Developed Business Objects reports using Webi Rich Client per new requirements.
- Developed reports using conditional formatting, cascading prompts, and report bursting.
- Developed list, crosstab, drill-through, master-detail, chart, and complex reports that involved multiple prompts, filters, and multi-page, multi-query reports against multiple databases; used filters for efficient data retrieval.
- Developed and enhanced BO universes using Universe Designer.
- Planned and designed Oracle schemas, designing database objects such as tables, views, partitions, indexes, sequences, constraints, procedures, functions, and database triggers.
- Performed Business Objects administration in the UNIX environment, including scheduling, deploying reports to other environments, automatic backups, user access, security, and control.
- Maintained the Business Objects repository and servers.
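The Hadoop bullets above can be illustrated with one minimal end-to-end sketch: a JDBC extract landed in HDFS, a Hive-context read queried with Spark SQL, and a partitioned Parquet write with Snappy compression. This is a sketch under assumptions (Spark 1.x with Hive support); the JDBC URL, credentials, table names, and HDFS paths are hypothetical placeholders, not actual project values.

    from pyspark import SparkContext
    from pyspark.sql import HiveContext

    sc = SparkContext(appName="gateway-etl-sketch")
    hive_ctx = HiveContext(sc)

    # 1. Extract from a relational source over JDBC and land it in HDFS
    #    (connection details and table name are placeholders).
    src = hive_ctx.read.format("jdbc").options(
        url="jdbc:oracle:thin:@//dbhost:1521/ORCL",
        dbtable="SRC.CLAIMS",
        user="etl_user",
        password="********",
        driver="oracle.jdbc.OracleDriver",
    ).load()
    src.write.mode("append").parquet("/data/raw/claims")

    # 2. Read a Hive table into a DataFrame through the Hive context and
    #    query it with Spark SQL.
    hive_ctx.table("ods.claims").registerTempTable("claims")
    recent = hive_ctx.sql(
        "SELECT member_id, claim_dt, amount FROM claims "
        "WHERE claim_dt >= '2016-01-01'"
    )

    # 3. Write the result as Snappy-compressed Parquet, partitioned so
    #    date-bounded queries read only the partitions they need.
    hive_ctx.setConf("spark.sql.parquet.compression.codec", "snappy")
    recent.write.partitionBy("claim_dt").mode("overwrite").parquet(
        "/data/warehouse/claims_recent"
    )

Partitioning on the query predicate column and compressing with Snappy mirrors the storage and performance choices described in the bullets above.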
Environment: Hadoop, Informatica 8.1.6/9.1.0/10.1, HDFS, Pig, Spark, Hive, Impala, HBase, Oozie, Flume, Sqoop, UNIX, Teradata RDBMS 13.10/14.10, BTEQ, FASTLOAD, MLOAD, FASTEXPORT, Teradata SQL Assistant, Oracle 11g/12c, SQL, IDQ 9.6.1, Business Objects XI 3.1.6/4.0, Shell Scripting, PL/SQL, XML, XSD, Python, Java, Toad, SQL Developer, Oracle BAM, PuTTY, WinSCP, HP Quality Center, XML Pad, Solaris 5.10, HP-UX 11i
Confidential, CA
Team Lead
Roles & Responsibilities:
- Participated in requirement gathering starting from the project bid.
- Wrote user stories based on gathered requirements along with BAs.
- Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data in the data warehouse.
- Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
- Developed complex mappings in Informatica to load the data from XML sources to Oracle database targets.
- Implemented performance-tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.
- Performed partitioning on Informatica targets to increase performance.
- Parameterized mappings and increased reusability.
- Used Informatica PowerCenter Workflow Manager to create sessions, workflows, and batches to run with the logic embedded in the mappings.
- Extensively used the Informatica Debugger to figure out problems in mappings; also involved in troubleshooting ETL bugs.
- Prepared design documents to describe program development, logic, coding, testing, changes, and corrections.
- Prepared unit test cases and supported system and integration testing.
- Followed Informatica recommendations, methodologies, and best practices.
- Scheduled jobs on a daily/weekly/monthly/yearly basis using the Informatica scheduler.
- Designed and developed Business Objects universes and reports in Web Intelligence.
- Enhanced existing PL/SQL procedures as per business requirements.
Environment: Informatica 8.6.1, Business Objects XI 3.1, Oracle 10g, PL/SQL, Shell Scripting, UNIX, XML, XSD, Toad, SQL Developer, Windows 7, Solaris 5.9, Linux 2.6
Confidential
Team Lead
Roles & Responsibilities:
- Involved in production support activities.
- Analyzed issues raised by the business in IMS (Issues Management System).
- Fixed issues and tested them in the development environment.
- Updated the issue tracker after fixes were applied.
- Involved in reporting/ETL enhancements.
- Gathered requirements and performed impact analysis on existing code.
- Prepared program specifications (PS) and conducted design walkthroughs with the client.
- Developed ETL mappings with Oracle and SQL Server sources and various target definitions.
- Created sessions and workflows in Workflow Manager.
- Identified performance bottlenecks in Informatica mappings using logs, verbose data, and the Debugger.
- Developed PL/SQL procedures and functions.
- Performed unit testing and prepared integrated unit testing (IUT) documentation.
- Applied fine-tuning methods to poorly performing SQL queries.
- Scheduled Informatica workflows in Autosys.
- Prepared design documents for enhancements.
- Involved in Database support (Oracle 10g).
Environment: Informatica 8.1.6, Oracle 10g, Business Objects XI R2, Xcelsius, Crystal Reports, SQL, PL/SQL, Java, Windows XP, Linux 2.6, SQL Server, Autosys, Toad, SQL Developer, HP Service Manager
Confidential
Sr. Developer
Roles & Responsibilities:
- Resolved issues reported by clients in reports and universes on a priority basis.
- Granted BO access rights to users.
- Maintained universes.
- Performed Business Objects administration on Windows Server.
- Maintained the repository in various environments.
- Maintained BO servers in the Windows environment.
- Analyzed the specifications provided by the client for enhancements.
- Analyzed the existing database and created reports.
- Developed an understanding of the business logic.
- Prepared specification documents.
- Designed and developed the BO universes required for generating reports.
- Created and formatted reports in Webi and Desktop Intelligence based on the specifications.
- Tested reports thoroughly before releasing them to production.
- Involved in ETL production support and enhancement activities.
- Migrated universes and reports from an Oracle 9i database to SQL Server 2005.
Environment: Business Objects 5.1.6, Informatica 7.1, Oracle 9i, SQL Server 2000, Windows NT Server
Confidential
Developer
Responsibilities:
- Designed and developed Business Objects universes and Desktop Intelligence reports.
- Prepared technical design documents.
- Provided design walkthroughs.
- Performed unit testing on BO universes and reports.
- Provided resolutions for defects raised by the testing team on BO reports.
Environment: Business Objects 5.1.6, SQL, Oracle 8i, Toad, Windows NT
