Technical Lead Resume
PROFESSIONAL SUMMARY:
- I have been associated with the IT industry for the past 9+ years across US and India locations.
- I am keen to learn new things and be part of something great; I am patient and quick to learn. I am looking for opportunities in an ETL Lead/Technical Manager role, or new opportunities in Big Data/Hadoop/Cloud Analytics.
- Around 9 years of IT industry experience encompassing a wide range of skill sets, roles, and industry verticals: Manufacturing, Retail, Insurance, and Energy & Utilities.
- Extensive experience with business transformation and integration, data warehouse design, and migration projects, serving in roles such as Technical Lead and Module Lead.
- Proficient in analyzing and translating business requirements into technical requirements, architecture, and estimations.
- Strong database skills and ETL solution design and implementation knowledge with IBM Information Server Datastage across versions 8.1, 8.5, 9.1, and 11.3.
- Efficient in all phases of the development lifecycle, including data cleansing, data conversion, and performance tuning.
- Experienced in client-facing roles, including interacting with clients on requirements gathering, project issues, and status reporting.
- Work with client technical leads/architects on design research and POCs for new implementations.
- Experience with POCs on Cloudera Hadoop: performed a POC for historical loads of Hive tables from a Teradata source using Sqoop.
- Hadoop POC using Hive and Impala scripting for datatype validation of input source files.
- Hadoop POC on generic code to read XML input files and load the data into Hive.
- Experience in writing SQL for Oracle, DB2, Teradata, and SQL Server databases, and in Unix shell scripting.
- Extensive experience in working with all business stakeholders on a project and obtaining project sign-offs.
- Extensive experience in configuration management and in creating deployment, rollback, and post-deployment plans.
- Good communication and interpersonal skills; self-motivated, quick learner, team player.
SKILLS:
BI ETL: ETL Solutions, Data Warehouse Design, Dimensional Modelling, System Performance Tuning, Reporting Framework
Hadoop: Cloudera Hadoop, Hive, Impala, Shell Scripting
Databases: IBM DB2 9.x, Oracle, Multidimensional Cubes, MQTs, Partitioning, Query Tuning, Informix
Languages: SQL, Unix Shell Scripting, C, C++
Tools: Datastage 8.x/9.1/11.3, Cognos, IBM M1, VSS, Clarify, Remedy, Data Studio, DbVisualizer, IBM Warehouse, IBM DB2 Cubing Services, Informatica Power Exchange, Autosys, ServiceNow, StarTeam
O/S: Windows NT/2000 Server, AIX Unix, Linux
EAI: WebSphere Message Broker, IBM Information Server
Processes: QMS (Quality Management System), Agile Methodology, SDLC
PROFESSIONAL EXPERIENCE:
Confidential
Technical Lead
Responsibilities:
- Design, develop, and test ETL processes, working along with the client's technical and business teams.
- Monitor the performance of Datastage jobs; identify and fix performance issues where required (a monitoring sketch follows this section).
- Interact with the business team and business analysts on new project work intakes.
- Work with business and client technical leads to gather requirements and produce effort estimations.
- Create HLD & LLD documents and review them with the client team.
- Enforce standards among all users of the application to ensure uniformity.
- Conduct design reviews, code reviews, and test result reviews with the client/business team.
- Create the deployment, rollback, and post-deployment support plans for the project.
Environment: Datastage 8.1/11.3, Linux, Oracle 12.1, Tivoli Workload Scheduler
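A minimal sketch of the monitoring wrapper referenced above, assuming the standard Datastage dsjob CLI; the project name, job name, and dsenv path are hypothetical placeholders:

    #!/bin/sh
    # Datastage job health check (sketch; project/job names are hypothetical).
    # dsenv sets up the Datastage CLI environment; the path varies by install.
    . /opt/IBM/InformationServer/Server/DSEngine/dsenv
    PROJECT="DW_PROJECT"
    JOB="LOAD_SALES_FACT"
    # -jobinfo reports the last run status and timings for the job.
    STATUS=$(dsjob -jobinfo "$PROJECT" "$JOB" | grep "Job Status")
    case "$STATUS" in
      *"RUN OK"*)            echo "$JOB finished cleanly." ;;
      *"RUN with WARNINGS"*) echo "$JOB finished with warnings; check the log." ;;
      *) echo "$JOB needs attention: $STATUS"
         # -logsum pulls recent fatal log entries for triage.
         dsjob -logsum -type FATAL -max 20 "$PROJECT" "$JOB" ;;
    esac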
Data Specialist
Confidential
Responsibilities:
- Interact with client managers on new project work intakes.
- Set up and initiate project kickoff with business teams.
- Work with business and client technical leads to gather requirements and produce effort estimations.
- Work with client architects and technical leads in performing POCs on new analytics tools.
- Worked on various Hadoop POCs (a combined sketch follows this section):
- Performed a POC for historical loads of Hive tables from a Teradata source using Sqoop.
- Cloudera Hadoop POC using Hive and Impala scripting for datatype validation of input source files.
- Hadoop POC on generic code to read XML input files and load the data into Hive.
- Create HLD & LLD documents and review them with the client team.
- Enforce standards among all users of the application to ensure uniformity.
- Conduct design reviews, code reviews, and test result reviews with the client/business team.
- Work with the program team to get the sign-offs for the project deployment.
- Create the deployment, rollback, and post-deployment support plans for the project.
- Conduct knowledge transfer (KT) to the production support team.
Environment: Cloudera Hadoop, Hive, Impala, Hue, WebSphere Datastage 8.1/11.3, Linux, Oracle 10g, Teradata, Autosys
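A combined sketch of the Sqoop historical-load and Hive datatype-validation POCs above. The Teradata host, credentials, and table names are hypothetical placeholders, and the validation step assumes a hypothetical raw staging table with string-typed columns:

    #!/bin/sh
    # Historical load POC (sketch): pull a Teradata table into Hive via Sqoop.
    sqoop import \
      --connect jdbc:teradata://td-host/DATABASE=EDW \
      --driver com.teradata.jdbc.TeraDriver \
      --username etl_user \
      --password-file /user/etl_user/.td_pwd \
      --table CUSTOMER_HIST \
      --hive-import \
      --hive-table staging.customer_hist \
      --num-mappers 8
    # Datatype validation POC (sketch): count rows whose raw string fields
    # fail a CAST to the expected type (Hive's CAST returns NULL on bad data).
    hive -e "
      SELECT COUNT(*) AS bad_rows
      FROM   staging.customer_hist_raw
      WHERE  CAST(cust_id AS BIGINT) IS NULL
         OR  CAST(balance AS DECIMAL(18,2)) IS NULL;"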
Confidential
ETL Data Specialist
Responsibilities:
- Worked as a Support Lead providing application maintenance and support for the Oncor DWH system.
- Worked on multiple tools in the data warehousing and business intelligence area, such as Datastage 8.x, Informatica Power Exchange, Informix, Oracle, and Cognos 8.4.
- Led a team of four for the implementation of Informatica Power Exchange.
- Responsible for automating multiple manual tasks by writing Unix shell scripts (see the sketch following this section).
- Involved in performance tuning and code enhancements to achieve better performance for various processes.
- Involved in sharing application knowledge with various development teams and assisted them in deployments/defect fixes.
- Responsible for effective communication within the team and with the client.
- Provide 24x7 application maintenance and support for Oncor's business-critical application: quick resolution of high-priority incidents, enhancement of applications and operating environments to align them with the latest business requirements, and application monitoring and control.
Environment: WebSphere Datastage 8.5, Informatica Power Exchange, Informatica 8.1, Informix 11.70, Oracle 10g, Solaris 5, AIX
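One example of the kind of manual task automated in this role, sketched with hypothetical paths and retention values (the actual scripts varied by task): archiving and purging aged ETL logs.

    #!/bin/sh
    # Log archival/purge sketch; directories and retention are hypothetical.
    LOG_DIR=/data/etl/logs
    ARCHIVE_DIR=/data/etl/logs/archive
    RETAIN_DAYS=30
    mkdir -p "$ARCHIVE_DIR"
    # Compress logs older than 7 days and move them into the archive.
    for f in $(find "$LOG_DIR" -name "*.log" -mtime +7); do
      gzip "$f" && mv "$f.gz" "$ARCHIVE_DIR"/
    done
    # Delete archived logs past the retention window.
    find "$ARCHIVE_DIR" -name "*.log.gz" -mtime +"$RETAIN_DAYS" -exec rm -f {} \;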
Confidential
ETL Data Specialist
Responsibilities:
- Responsible for leading a team delivering solutions to our customer in the Insurance sector.
- Deliver new and complex high-quality solutions to clients in response to varying business requirements.
- Responsible for managing scope, analysis, and solution design for various aspects of the project.
- Responsible for translating the customer's requirements into technical solutions and then implementing them.
- Translate customer requirements into formal requirements and design documents, establish specific solutions, and lead the efforts, including programming and testing, that culminate in client acceptance of the results.
- Utilize in-depth functional and technical experience in ETL design and database migration systems, along with other leading-edge products and technology, in conjunction with industry and business skills, to deliver solutions to the customer.
- Establish quality procedures for the team, and continuously monitor and audit to ensure the team meets its quality goals.
Environment: WebSphere Datastage 8.1, Linux, Oracle, DB2 9.5
IT Analyst
Confidential
Responsibilities:
- Worked on all phases of the data warehouse development lifecycle, from requirements gathering to testing, implementation, and support.
- Proficient in dimensional modeling: design of conformed facts and conformed dimensions, master data management, data profiling, data cleansing, and data integration; Slowly Changing Dimension (SCD) implementation and surrogate key assignment (a minimal SCD sketch follows this section); design of job control techniques and incremental load criteria; design of data marts with degenerate dimensions; and reconciliation of data.
- Have followed the Ralph Kimball and Inmon methodologies in previous projects; worked on a 3NF data warehouse with a top-down design approach.
- Have implemented CDC designs and a real-time/active data warehouse.
- 4.5 years of wide-ranging experience in development, configuration management, and support activities for a data warehouse with multiple source systems, including an XML message queue (MQ).
- Performed source data analysis and identified business rules for data migration and for developing the data warehouse.
- Involved in identifying the data inputs from various source systems based on the business requirements.
- Involved in high-level design covering the implementation of the different ETL layers and structure, scheduling, and dimensional modeling.
- Involved in low-level design; developed various jobs and performed data loads and transformations using different DataStage stages and pre-built routines and functions.
- Team player with excellent communication and problem-solving skills.
- 2 years of onsite experience with client interaction and coordination.
- Expert in the development of ETL jobs and sequences in Datastage 8.1 and 8.5.
- Experienced in setting up and administering Datastage projects and user accounts.
- Automated the Datastage health check, the performance tuning process, the Datastage backup process, and Datastage services startup after failures.
- Worked with message queues and the Distributed Transaction stage; managed environment variables and set up and managed ODBC connections.
- Experience in performance tuning of Datastage and other ETL jobs.
- Expert in ETL job design with database stages, message queues, XML stages, Lookup, Join, Sort, Remove Duplicates, Aggregator, Funnel, and Transformer stages; configuration files; restartable sequences with Execute Command, mail notification, and exception handling; stored procedure stages; datasets; and sequential files.
- Extensively used UNIX shell scripts in all projects to automate various activities, including scripts that trigger ETL activities after dependency checks (see the wrapper sketch following this section).
- Automated triggering of any Cognos report from an ETL trigger via Datastage.
- Automated the capture of ETL performance parameters such as records processed, elapsed time, start time, and finish time. This performance data is helpful for performance tuning and for resolving resource utilization bottlenecks.
- Automated the FTP processes; wrote scripts to update the ETL job control tables.
- Wrote scripts to execute Cognos cube builds, scratch space cleanup, etc.
- Data reconciliation.
- Used DataStage Designer to develop various jobs to extract, cleanse, transform, integrate, and load data into the data warehouse.
- Parameterized DataStage jobs to allow portability and flexibility at runtime.
- Prepared design documents, test case specifications, performance reviews, and code, and got them signed off by the client.
- Design, test case specifications, performance review, and coding of shared containers and reusable jobs.
- Wrote complex queries to supply the data needed to verify project artifact results.
Environment: AIX, Oracle 9i, DB2 9.5, InfoSphere Datastage 8.1/8.5/9.1, Cognos 8/10, Unix
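A minimal sketch of the Type 2 SCD pattern referenced above, written as a DB2 CLP script; the database, table, column, and sequence names are hypothetical placeholders, and the real implementations ran inside Datastage jobs rather than as ad hoc SQL:

    #!/bin/sh
    # Type 2 SCD sketch (hypothetical names): close out changed dimension rows,
    # then insert new versions with fresh surrogate keys from a sequence.
    db2 -tv <<'SQL'
    CONNECT TO DWHDB;
    -- Expire current rows whose tracked attribute changed in the staging feed.
    UPDATE dw.customer_dim AS d
    SET    curr_flag = 'N',
           end_date  = CURRENT DATE
    WHERE  d.curr_flag = 'Y'
    AND    EXISTS (SELECT 1 FROM stg.customer s
                   WHERE s.cust_id = d.cust_id
                   AND   s.addr   <> d.addr);
    -- Insert new versions: customers with no current row (new or just expired).
    INSERT INTO dw.customer_dim
           (cust_sk, cust_id, addr, eff_date, end_date, curr_flag)
    SELECT NEXT VALUE FOR dw.cust_sk_seq,
           s.cust_id, s.addr, CURRENT DATE, DATE('9999-12-31'), 'Y'
    FROM   stg.customer s
    LEFT JOIN dw.customer_dim d
           ON d.cust_id = s.cust_id AND d.curr_flag = 'Y'
    WHERE  d.cust_id IS NULL;
    CONNECT RESET;
    SQL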
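And a sketch of the dependency-check/trigger/statistics wrapper pattern mentioned above, assuming the standard Datastage dsjob CLI; the project, job, trigger file, and output paths are hypothetical placeholders:

    #!/bin/sh
    # ETL trigger wrapper sketch: dependency check, run, then capture run stats.
    . /opt/IBM/InformationServer/Server/DSEngine/dsenv
    PROJECT="DW_PROJECT"
    JOB="LOAD_ORDERS"
    TRIGGER=/data/landing/orders.done   # upstream feed drops this when ready
    # Dependency check: poll up to an hour for the upstream trigger file.
    i=0
    while [ $i -lt 60 ]; do
      [ -f "$TRIGGER" ] && break
      sleep 60
      i=$((i+1))
    done
    [ -f "$TRIGGER" ] || { echo "Dependency not met for $JOB"; exit 1; }
    # Run the job; -jobstatus waits for completion and sets the exit code
    # from the job's finishing status.
    dsjob -run -jobstatus "$PROJECT" "$JOB"
    RC=$?
    # Capture start/finish/elapsed times and per-link row counts for the
    # performance-tracking process described above.
    dsjob -report "$PROJECT" "$JOB" DETAIL > "/tmp/${JOB}.stats"
    exit $RC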