
Technology Lead/Data Engineer Resume

SUMMARY

  • 14 years of IT experience with extensive knowledge of the Software Development Life Cycle (SDLC), including Requirements Gathering, Analysis, Design, Development, Testing, Implementation, and Maintenance.
  • Research, evaluate, and identify alternative approaches to support development needs.
  • Recommend, design, and code efficient, effective solutions to challenging problems in medium-to-large work efforts of medium-to-high complexity.
  • Comply with standards and guidelines governing design, construction, testing, and deployment activities within Delivery Management environments.
  • Demonstrate collaborative skills while working within project teams of diverse skills.
  • Bring strong oral, written, and presentation communication skills, plus creativity and problem-solving skills, to a challenging environment.
  • Extensive knowledge in Development, Analysis and Design of ETL methodologies in all the phases of Data Warehousing life cycle.
  • Extensive experience working with DataStage Designer and various Stages of DataStage Parallel Extender.
  • Extensive use of DataStage client components - DataStage Director, DataStage Designer, DataStage Administrator.
  • Experience in working with various data sources like Sequential files, Mainframe files (COBOL copy books), Oracle, DB2, and Teradata.
  • Working knowledge of databases including Teradata, Oracle, and DB2.
  • Experience working with Teradata utilities such as FastLoad, MultiLoad, BTEQ, and TPump (a BTEQ-in-shell sketch follows this list).
  • Working knowledge of version control using ClearCase and CA SCM.
  • Working experience with DataStage stages such as Transformer, Sequential File, Data Set, Join, Lookup, Funnel, and Modify to transform data and load it into the data warehouse.
  • Experience working in AGILE and SCRUM methodologies for development and performance tuning.
  • Experience in using UNIX and IBM AIX commands and writing UNIX shell scripts.
  • Expertise in Data integration and migration.
  • Exposure to Big Data concepts.
  • Development, design, testing and implementation of ETL/Data Movement in support of a large-scale Data Warehouse and other Enterprise projects.
  • Develop, test and implement modifications to existing ETL/Data Movement for business and technical changes, fixes and enhancements.
  • Served as a knowledge resource for ETL/Data Movement, collaborating with business analysts and business users to solve analytical, statistical, and reporting needs in support of new or changed ETL/Data Movement.
  • Understand the performance implications of ETL design and diagnose and troubleshoot performance/configuration problems.
  • Worked directly with other IT teams to acquire, create, transmit, and/or transform data for projects.
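
Illustrative only: a minimal sketch of the kind of BTEQ load wrapped in a UNIX shell script referenced in the Teradata utilities and shell-scripting bullets above. The TDPID, credentials, file path, column layout, and staging table are hypothetical placeholders, not details from an actual engagement.

    #!/bin/ksh
    # Minimal sketch: load a pipe-delimited flat file into a Teradata staging
    # table through BTEQ. tdprod, etl_user, TD_PASSWORD, the file path, and
    # stg.customer_stg are placeholders.

    LOADFILE=/data/incoming/customer.dat

    bteq <<EOF
    .LOGON tdprod/etl_user,${TD_PASSWORD}
    .IMPORT VARTEXT '|' FILE = ${LOADFILE}
    .QUIET ON
    .REPEAT *
    USING (cust_id VARCHAR(18), cust_name VARCHAR(60))
    INSERT INTO stg.customer_stg (cust_id, cust_name)
    VALUES (:cust_id, :cust_name);
    .LOGOFF
    .QUIT
    EOF

    RC=$?
    if [ $RC -ne 0 ]; then
        echo "BTEQ load failed with return code $RC" >&2
        exit $RC
    fi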

TECHNICAL SKILLS

Databases/RDBMS: Snowflake, Oracle 11g/12c/Exadata, Teradata 13/14/15, SQL Server, DB2, SYBASE/Facets, MySQL, MongoDB

Operating Systems: RHEL, UNIX (Sun Solaris, AIX, HP-UX), and Windows

Programming Languages: Java, JavaScript, HTML

ETL Technologies: IBM InfoSphere DataStage and QualityStage 11.7/11.5/9.1/8.7, Ab Initio, Matillion

Data Visualization: Power BI, Tableau, OBIEE

Cloud Technologies: Azure

Scripting Languages: Shell Scripts

Scheduling Tools: AUTOSYS, CA7, TWS

Functional Knowledge: Retail, Banking and Finance, Capital Markets, Insurance, Telecom and Health care.

Other Tools: Altova XMLSpy, XML/XSD, Jenkins, JIRA, Microsoft Office Tools, MS Visio, Teradata SQL Assistant, JSON, REST, SOAP, ClearCase, SVN, GIT, Aqua Data, TOAD, ServiceNow, etc.

PROFESSIONAL EXPERIENCE

Confidential

Technology Lead/Data Engineer

Responsibilities:

  • Involved in architecting the data integration and led the track for implementing a Big Data analytics solution.
  • Provided solutions to integration teams across the lines of business.
  • Worked in AGILE and SCRUM methodologies for development and performance tuning.
  • Involved in the requirements gathering, analysis, design, development, and maintenance phases.
  • Worked closely with Analysts and business users to gather the requirements and business rules.
  • Worked on data migration from a traditional data warehouse to Snowflake.
  • Worked on loading flat file/XML/JSON formats and Oracle database data into Snowflake.
  • Created pipelines to import data into Snowflake stages (see the SnowSQL sketch after this list).
  • Managed and mentored a team of developers to design, implement, and test solutions.
  • Integrated the Salesforce Connector with DataStage.
  • Created ETL jobs to load data into Azure Data Lake using the ADL stage.
  • Designed and developed DataStage ETL jobs to process data into Salesforce.
  • Designed and developed a reusable ETL process to load the stage tables from EDI.
  • Involved in data modeling for EDW projects.
  • Supported End to End monitoring of the application flows for Data warehousing projects.
  • Developed MongoDB CRUD operations.
  • Developed MongoDB aggregation pipelines; designed indexing strategies and configured, monitored, and deployed replica sets.
  • Created data mappings using Altova MapForce.
  • Jenkins administration - view creation, user access provisioning, continuous integration, and troubleshooting.
  • Created reports and dashboards using Power BI for data visualization.
  • Extensively worked on DataStage project creation and configuration.
  • Extensively worked on user creation and troubleshooting of DataStage environments.
  • DataStage Administration - Connector configuration, Plugins and PMR management.
  • Resolved issues assigned through ServiceNow incidents and provided permanent solutions to bugs.
  • Performance-tuned ETL DataStage jobs where data volumes are large and require parallel processing.
  • Coordinated changes for deploying ETL, shell scripting, and database SQL scripts into higher environments.
  • Provided on-call support for the applications developed, coordinating with offshore and onshore teams to resolve issues.
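
Illustrative only: a minimal SnowSQL-from-shell sketch of the kind of Snowflake stage load referenced above (staging a JSON file and copying it into a table). The named connection, internal stage, and target table are hypothetical placeholders; the actual pipelines were built in the ETL tooling rather than this exact script.

    #!/bin/ksh
    # Minimal sketch: push a local JSON file to a Snowflake internal stage and
    # copy it into a table. The connection (etl_conn), stage (@stg_json), and
    # table (raw.claims_json, assumed to have a single VARIANT column) are
    # placeholders.

    SRC_FILE=/data/outbound/claims.json

    snowsql -c etl_conn -o exit_on_error=true -q "
    PUT file://${SRC_FILE} @stg_json AUTO_COMPRESS=TRUE;
    COPY INTO raw.claims_json
      FROM @stg_json
      FILE_FORMAT = (TYPE = 'JSON')
      ON_ERROR = 'ABORT_STATEMENT';
    "

    RC=$?
    if [ $RC -ne 0 ]; then
        echo "Snowflake stage load failed with return code $RC" >&2
        exit $RC
    fi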

Environment: IBM InfoSphere DataStage & QualityStage 11.7/11.5/8.7, Oracle 12c/11g, DB2, Shell Scripts, UNIX/Linux, TWS, IIB10, Jenkins, GIT, SVN, Splunk, JIRA, Salesforce.com, XMLSpy, ServiceNow, Windows NT/XP, etc.

Confidential

Senior ITDS Architect/Sr. Data Analyst

Responsibilities:

  • Involved in the requirements gathering, analysis, design, development, and maintenance phases.
  • Worked closely with Analysts and business users to gather the requirements and business rules.
  • Developed source-to-target mapping documents and technical specification documents.
  • Developed a generic DataStage job to load the stage tables using dynamic SQL and UNIX scripts (a dsjob wrapper sketch follows this list).
  • Generated SHCA XML for regulatory reports from spreadsheets, to be submitted to the Federal Bank.
  • Developed ETL for the CAMRA to EPAM conversion project.
  • Involved in architecting the data integration and led the track for implementing a Big Data analytics solution.
  • Integrated six data sources into one consolidated, cohesive data set.
  • Initiated a data governance program for Product Lifecycle Engineering data, establishing procedures for maintaining high data quality and resolving all data issues.
  • Project maintenance and support for the end users and business analysts to ensure the application is working as expected.
  • Working closely with operations team to fix the defects and provide the solution.
  • Working in AGILE and SCRUM methodologies for development and performance tuning.
  • Successfully integrated data across multiple high-volume data sources and target applications.
  • Maintained the metadata and business glossary using the IBM Information Governance Catalog (IGC).
  • Worked on the Manhattan IWMS project with Trimble for HR Analytics.
  • Managed and mentored a team of developers to design, implement, and test solutions.
  • Working experience with OFSDF (Oracle Financial Services Data Foundation).
  • Participated in on-call support for the OFSDF project.
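
Illustrative only: a minimal sketch of a shell wrapper driving a generic, parameterized DataStage job of the kind referenced above through the dsjob command line. The project, job, and parameter names are hypothetical placeholders; the actual jobs were project-specific.

    #!/bin/ksh
    # Minimal sketch: run a generic DataStage job with runtime parameters (e.g.
    # the SQL it should execute). EDW_PROJ, jGenericStageLoad, and the parameter
    # names are placeholders; DSHOME is assumed to point at the engine install.

    . ${DSHOME}/dsenv                    # source the DataStage environment

    TABLE_NAME=${1:?"usage: run_stage_load.ksh <table_name>"}
    EXTRACT_SQL="SELECT * FROM src.${TABLE_NAME}"

    ${DSHOME}/bin/dsjob -run \
        -param pExtractSQL="${EXTRACT_SQL}" \
        -param pTargetTable="stg.${TABLE_NAME}" \
        -jobstatus \
        EDW_PROJ jGenericStageLoad
    RC=$?

    # With -jobstatus, dsjob's exit code reflects the finished job status
    # (commonly 1 = finished OK, 2 = finished with warnings).
    if [ $RC -ne 1 ] && [ $RC -ne 2 ]; then
        echo "DataStage job failed for ${TABLE_NAME} (dsjob status $RC)" >&2
        exit 1
    fi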

Environment: IBM InfoSphere DataStage & QualityStage 11.5, IGC, FastTrack, Oracle 11g, OBIEE, Shell Scripts, UNIX/Linux, Windows NT/XP, HP ALM, AUTOSYS R11, RTC Scrum, MS Visio, etc.

Confidential

Lead ETL Developer

Responsibilities:

  • Involved in the requirements gathering, analysis, design, development, and maintenance phases.
  • Worked closely with Data Modelers and Architects to gather the requirements and business rules.
  • Created low level technical documents based on the requirements.
  • Developed ETL jobs using DataStage to populate the target Oracle dimensions and facts.
  • Mentored junior developers and helped them gain familiarity with the system.
  • Successfully integrated data across multiple high-volume data sources (Lawson, SSMS, Rx, etc.) and target applications.
  • Automated ETL processes using the DataStage Job Sequencer and the ESP scheduler to schedule jobs.

Environment: IBM InfoSphere DataStage & QualityStage 9.1, Teradata 14, Oracle 11g, Shell Scripts, UNIX/Linux, Windows NT/XP, MS Visio, etc.

Confidential

Sr. Data Specialist

Responsibilities:

  • Involved in the requirements gathering, analysis, design, development, and maintenance phases.
  • Worked closely with Business Analysts and Architects to gather the requirements and business rules.
  • Developed/Created source to target mapping documents and technical specification documents.
  • Created Technical Design Documents (TDDs).
  • Took on greater responsibilities to lead the team and deliver high-quality code.
  • Reviewed code and suggested modifications and changes.
  • Resolved defects and issues in the production environment and fixed bugs.
  • Project maintenance and support for the end users and business analysts to ensure the application is working as expected.
  • Working closely with operations team to fix the defects and provide the solution.
  • Working in AGILE and SCRUM methodologies for development and performance tuning.
  • Designed parallel jobs using stages such as Join, Transformer, Sort, Merge, Filter, Lookup, Sequence, Modify, and Peek.
  • Extensively involved in data extraction, transformation, and loading (ETL) from source to target systems using DataStage PX.
  • Developed UNIX shell scripts to manipulate the data (a data-cleanup sketch follows this list).
  • Successfully integrated data across multiple high-volume data sources and target applications.
  • Automated ETL processes using the DataStage Job Sequencer and the AUTOSYS scheduler to schedule jobs.
  • Led a team of 8 people across offshore and onsite.
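
Illustrative only: a minimal sketch of the kind of UNIX shell data manipulation referenced above, normalizing a pipe-delimited extract before it is picked up by the ETL jobs. The file paths and field layout are hypothetical placeholders.

    #!/bin/ksh
    # Minimal sketch: clean a pipe-delimited extract before the DataStage load.
    # File paths and the field layout (3rd field = status) are placeholders.

    IN_FILE=/data/landing/member_extract.dat
    OUT_FILE=/data/ready/member_extract.dat

    # Strip carriage returns, drop the header row, trim whitespace around every
    # field, and default an empty status field to 'U'.
    tr -d '\r' < "${IN_FILE}" | awk -F'|' -v OFS='|' '
    NR > 1 {
        for (i = 1; i <= NF; i++) {
            gsub(/^[ \t]+|[ \t]+$/, "", $i)
        }
        if ($3 == "") $3 = "U"
        print
    }' > "${OUT_FILE}"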

Environment: IBM InfoSphere DataStage & QualityStage 8.5, Teradata 14, Oracle 11g, Shell Scripts, UNIX/Linux, Windows NT/XP, AnthillPro, CA7, MS Visio, etc.

Confidential

Senior DataStage Consultant

Responsibilities:

  • Involved in the requirements gathering, analysis, design, and development phases.
  • Worked closely with Business Analysts and Architects to gather the requirements and business rules.
  • Developed source-to-target mapping documents and technical specification documents.
  • Obtained a detailed understanding of data sources, flat files, and complex data schemas.
  • Designed parallel jobs using stages such as Join, Transformer, Sort, Merge, Filter, Lookup, Sequence, Modify, and Peek.
  • Extensively involved in data extraction, transformation, and loading (ETL) from source to target systems using DataStage PX.
  • Successfully integrated data across multiple high-volume data sources and target applications.
  • Automated ETL processes using the DataStage Job Sequencer and the AUTOSYS scheduler to schedule jobs (an AUTOSYS CLI sketch follows this list).
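
Illustrative only: a minimal sketch of the AUTOSYS side of the scheduling referenced in the last bullet above, checking a job from the command line and force-starting it after a failure. The job name is a hypothetical placeholder.

    #!/bin/ksh
    # Minimal sketch: report on an AutoSys job and force-start it if its last
    # run failed. EDW_DLY_CUST_LOAD is a placeholder job name.

    JOB=EDW_DLY_CUST_LOAD

    # Print the standard job report (last start/end, status, run number).
    autorep -J "${JOB}"

    # autostatus returns the current status text (SUCCESS, FAILURE, RUNNING, ...).
    STATUS=$(autostatus -J "${JOB}")

    if [ "${STATUS}" = "FAILURE" ]; then
        echo "${JOB} failed; forcing a restart"
        sendevent -E FORCE_STARTJOB -J "${JOB}"
    fi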

Environment: IBM InfoSphere DataStage & QualityStage 8.5, Teradata 13, Shell Scripts, Sun UNIX, Windows NT/XP, CA SCM, AUTOSYS R11, etc.
