Technology Lead/Data Engineer Resume
Greenville, SC
SUMMARY:
- 14 years of IT experience with extensive knowledge of the Software Development Life Cycle (SDLC), involving requirements gathering, analysis, design, development, testing, implementation, and maintenance.
- Research, evaluate, and identify alternative approaches to support development needs.
- Recommend, design, and code efficient, effective solutions to challenging problems for medium-to-large work efforts of medium to high complexity.
- Comply with standards and guidelines related to design, construction, testing, and deployment activities within the Delivery Management environments.
- Demonstrate collaborative skills while working within a project team of diverse skill sets.
- Bring strong oral, written, and presentation communication skills, along with creativity and problem-solving ability, to a challenging environment.
- Extensive knowledge of the development, analysis, and design of ETL methodologies in all phases of the data warehousing life cycle.
- Extensive experience working with DataStage Designer and the various stages of DataStage Parallel Extender.
- Extensive use of DataStage client components: DataStage Director, DataStage Designer, and DataStage Administrator.
- Experience working with various data sources such as sequential files, mainframe files (COBOL copybooks), Oracle, DB2, and Teradata.
- Working knowledge on different databases like TERADATA, ORACLE, and DB2.
- Experience working with various Teradata utilities such as FLOAD, MLOAD, BTEQ, and TPUMP (a representative BTEQ wrapper sketch follows this summary).
- Working knowledge of version control using ClearCase and CA SCM.
- Working experience with various DataStage stages such as Transformer, Sequential File, Data Set, Join, Lookup, Funnel, Modify, etc., to transform the data and load it into the data warehouse.
- Experience working in AGILE and SCRUM methodologies for development and performance tuning.
- Experience in using UNIX and IBM AIX commands and writing UNIX shell scripts.
- Expertise in Data integration and migration.
- Exposure to Big Data concepts.
- Development, design, testing and implementation of ETL/Data Movement in support of a large-scale Data Warehouse and other Enterprise projects.
- Develop, test and implement modifications to existing ETL/Data Movement for business and technical changes, fixes and enhancements.
- Served as a source of ETL/Data Movement knowledge, collaborating with business analysts and business users to solve analytical, statistical, and reporting needs in support of new or changed ETL/Data Movement.
- Understand the performance implications of ETL design and am able to diagnose and troubleshoot performance/configuration problems.
- Worked directly with other IT teams to acquire, create, transmit, and/or transform data for projects.
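Illustrative sketch of the Teradata utility and UNIX shell scripting work noted above: a minimal BTEQ wrapper script. The server, user, and table names are placeholders, not drawn from any specific engagement.

```sh
#!/bin/sh
# Minimal sketch only: run an ad hoc BTEQ step from a UNIX wrapper script.
# The TDP id, user, and table are placeholders; the password is expected
# in the TD_PASSWORD environment variable.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWORD}
SELECT COUNT(*) FROM EDW.STG_CUSTOMER;
.IF ERRORCODE <> 0 THEN .QUIT 8
.QUIT 0
EOF
RC=$?
if [ "$RC" -ne 0 ]; then
    echo "BTEQ step failed with return code $RC" >&2
    exit "$RC"
fi
```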
TECHNICAL SKILLS:
Databases/RDBMS: Snowflake, Oracle 11g/12c/Exadata, Teradata 13/14/15, SQL Server, DB2, SYBASE/Facets, MySQL, MongoDB, PostgreSQL
Operating Systems: RHEL, Sun Solaris UNIX, AIX, HP-UX, and Windows
Programming Languages: Java, JavaScript, HTML
ETL Technologies: IBM Infosphere DataStage and QualityStage 11.7/11.5/9.1/8.7, Ab-Initio, Informatica 9.x, Matillion
Data Visualization: Power BI, Tableau, OBIEE
Scripting Languages: Shell Scripts
Scheduling Tools: AUTOSYS, CA7, TWS
Functional Knowledge: Retail, Banking and Finance, Capital Markets, Insurance, Telecom, and Healthcare.
Other Tools: Kafka, Altova XMLSpy, XML/XSD, Jenkins, JIRA, Microsoft Office Tools, MS Visio, Teradata SQL Assistant, JSON, REST, SOAP, ClearCase, SVN, GIT, Aqua Data, TOAD, ServiceNow, etc.
PROFESSIONAL EXPERIENCE:
Confidential, Greenville, SC
Technology Lead/Data Engineer
Responsibilities:
- Involved in architecting the data integration and led the track for implementing a Big Data analytics solution.
- Provided solutions to the integration teams across the lines of business.
- Worked in Agile and Scrum methodologies for development and performance tuning.
- Involved in the requirements gathering, analysis, design, development, and maintenance phases.
- Worked closely with analysts and business users to gather the requirements and business rules.
- Used Salesforce.com to analyze customer sales orders and prospect customer orders.
- Worked on data migration from traditional data warehouse to Snowflake.
- Worked on loading flat file, XML, and JSON formats into Snowflake.
- Created pipelines to import data into the Snowflake stage (a representative sketch follows this list).
- Managed/mentored a team of developers to design, implement, and test solutions.
- Integrated the Salesforce Connector with DataStage.
- Designed/developed DataStage ETL jobs to process data into Salesforce portals.
- Designed/developed a reusable ETL process to load the stage tables from EDI.
- Involved in data modeling for EDW projects.
- Supported end-to-end monitoring of the application flows for data warehousing projects.
- Created/developed data mappings using MapForce.
- Jenkins administration: view creation, user access provisioning, continuous integration, and troubleshooting.
- Created/developed reports/dashboards using Power BI for data visualization.
- Extensively worked on DataStage project creation and configuration.
- Extensively worked on user creation and troubleshooting of DataStage environments.
- DataStage administration: connector configuration, plugins, and PMR management.
- Resolved issues assigned through ServiceNow incidents and provided permanent solutions to bugs.
- Performance tuning of ETL DataStage jobs where the data volume is large and requires parallel processing.
- Change coordination for the deployment of ETL, shell scripting, and database SQL scripts into higher environments.
- On-call support for the applications developed, coordinating with offshore and onshore teams to resolve issues.
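Illustrative sketch of the Snowflake load pipelines noted above. The account, database, schema, stage, table, and file-format settings are assumptions for illustration only, not the project's actual objects.

```sh
#!/bin/sh
# Minimal sketch only: stage local files and COPY them into a Snowflake table.
# Account, database, schema, stage, and table names are placeholders; the
# password is expected in the SNOWSQL_PWD environment variable.
snowsql -a "$SF_ACCOUNT" -u "$SF_USER" -d EDW_DB -s LANDING <<'SQL'
PUT file:///data/landing/orders_*.csv @ORDERS_STAGE AUTO_COMPRESS=TRUE;
COPY INTO STG_ORDERS
  FROM @ORDERS_STAGE
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = '|' SKIP_HEADER = 1)
  ON_ERROR = 'ABORT_STATEMENT';
-- JSON files would instead use FILE_FORMAT = (TYPE = JSON) into a VARIANT column.
SQL
```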
Environment: IBM Infosphere DataStage & QualityStage 11.7/11.5/8.7, Oracle 12c/11g, DB2, Shell Scripts, UNIX/LINUX, TWS, IIB10, Jenkins, GIT, SVN, SPLUNK, JIRA, Salesforce.com, XML Spy, ServiceNow, Windows NT/XP, etc.
Confidential, Charlotte, NC
Senior ITDS Architect/Sr. Data Analyst
Responsibilities:
- Involved in the requirements gathering, analysis, design, development, and maintenance phases.
- Worked closely with analysts and business users to gather the requirements and business rules.
- Developed/created source-to-target mapping documents and technical specification documents.
- Developed/created a generic DataStage job to load the stage tables using dynamic SQL and UNIX scripts (see the sketch after this list).
- Created/generated SHCA XML for regulatory reports from spreadsheets, to be submitted to the Federal Reserve Bank.
- Developed/created ETL for the CAMRA-to-EPAM conversion project.
- Involved in architecting the data integration and led the track for implementing a Big Data analytics solution.
- Integrated 6 data sources into one consolidated, cohesive whole.
- Initiated a data governance program for Product Lifecycle Engineering data, establishing procedures for its maintenance.
- Project maintenance and support for the end users and business analysts to ensure the application works as expected.
- Worked closely with the operations team to fix defects and provide solutions.
- Worked in Agile and Scrum methodologies for development and performance tuning.
- Successfully integrated data across multiple, high-volume data sources and target applications.
- Maintained the metadata and business glossary using the IBM Information Governance Catalog (IGC).
- Worked on the Manhattan IWMS project with Trimble for HR Analytics.
- Managed/mentored a team of developers to design, implement, and test solutions.
- Working experience with OFSDF (Oracle Financial Services Data Foundation).
- Participated in On-call support for OFSDF project.
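Illustrative sketch of driving the generic DataStage job noted above from UNIX via the dsjob command line; the project, job, and parameter names below are hypothetical.

```sh
#!/bin/sh
# Minimal sketch only: run a generic DataStage job, passing the source SQL
# and target table as runtime parameters. Project, job, and parameter names
# are hypothetical; the dsenv path assumes a default engine install.
. /opt/IBM/InformationServer/Server/DSEngine/dsenv

TABLE_NAME="$1"
dsjob -run -wait \
      -param pSourceSQL="SELECT * FROM STG.${TABLE_NAME}" \
      -param pTargetTable="STG.${TABLE_NAME}" \
      -jobstatus DWH_PROJECT jGenericStageLoad
RC=$?
# With -jobstatus, dsjob returns the job status code:
# 1 = finished OK, 2 = finished with warnings; anything higher is a failure here.
if [ "$RC" -gt 2 ]; then
    echo "jGenericStageLoad failed for ${TABLE_NAME} (status $RC)" >&2
    exit "$RC"
fi
```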
Environment: IBM Infosphere DataStage & QualityStage 11.5, IGC, FastTrack, Oracle 11g, OBIEE, Shell Scripts, UNIX/LINUX, Windows NT/XP, HP ALM, AUTOSYS R11, RTC Scrum, MS Visio, etc.
Confidential, Phoenix, AZ
Lead ETL Developer
Responsibilities:
- Involved in the requirements gathering, analysis, design, development, and maintenance phases.
- Worked closely with data modelers and architects to gather the requirements and business rules.
- Created low-level technical documents based on the requirements.
- Developed ETL jobs using DataStage to populate the target Oracle dimensions and facts.
- Mentored junior developers and helped them gain familiarity with the system.
- Successfully integrated data across multiple (Lawson, SSMS, Rx, etc.) high-volume data sources and target applications.
- Automated ETL processes using DataStage Job Sequencer and the ESP scheduler to schedule the jobs.
Environment: IBM Infosphere DataStage & QualityStage 9.1, Teradata 14, Oracle 11g, Shell Scripts, UNIX/LINUX, Windows NT/XP, MS Visio, etc.
Confidential, Bentonville, AR
Sr. Data Specialist
Responsibilities:
- Involved in the requirements gathering, analysis, design, development, and maintenance phases.
- Worked closely with business analysts and architects to gather the requirements and business rules.
- Developed/created source-to-target mapping documents and technical specification documents.
- Created Technical Design Documents (TDD).
- Took on greater responsibilities to lead the team and deliver good-quality code.
- Reviewed code and suggested modifications/changes.
- Resolved defects and issues in the production environment and performed bug fixes.
- Project maintenance and support for the end users and business analysts to ensure the application works as expected.
- Worked closely with the operations team to fix defects and provide solutions.
- Worked in Agile and Scrum methodologies for development and performance tuning.
- Designed parallel jobs using various stages such as Join, Transformer, Sort, Merge, Filter, Lookup, Sequence, Modify, Peek, etc.
- Heavily involved in data extraction, transformation, and loading (ETL) from source to target systems using DataStage PX.
- Developed UNIX shell scripts to manipulate the data.
- Successfully integrated data across multiple, high-volume data sources and target applications.
- Automated ETL processes using DataStage Job Sequencer and the AutoSys scheduler to schedule the jobs (a sample job definition follows this list).
- Led a team of 8 people across offshore and onsite.
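Illustrative sketch of an AutoSys JIL definition of the kind used for the scheduling noted above; the job, machine, owner, script, and log paths are placeholders rather than actual production definitions.

```sh
#!/bin/sh
# Minimal sketch only: load one AutoSys CMD job definition via JIL.
# Job, machine, owner, script, and log paths are placeholders.
jil <<'EOF'
insert_job: EDW_SALES_DAILY_LOAD
job_type: CMD
command: /opt/etl/scripts/run_sequence.sh sSalesDailyLoad
machine: etlprod01
owner: etladmin
date_conditions: 1
days_of_week: all
start_times: "02:00"
std_out_file: /opt/etl/logs/EDW_SALES_DAILY_LOAD.out
std_err_file: /opt/etl/logs/EDW_SALES_DAILY_LOAD.err
alarm_if_fail: 1
EOF
```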
Environment: IBM Infosphere DataStage & QualityStage 8.5, Teradata 14, Oracle 11g, Shell Scripts, UNIX/LINUX, Windows NT/XP, Anthill Pro, CA7, MS Visio, etc.
Confidential, Minneapolis, MN
Senior DataStage Consultant
Responsibilities:
- Involved in the requirements gathering, analysis, design, and development phases.
- Worked closely with business analysts and architects to gather the requirements and business rules.
- Developed/created source-to-target mapping documents and technical specification documents.
- Obtained a detailed understanding of the data sources, flat files, and complex data schemas.
- Designed parallel jobs using various stages such as Join, Transformer, Sort, Merge, Filter, Lookup, Sequence, Modify, Peek, etc.
- Heavily involved in data extraction, transformation, and loading (ETL) from source to target systems using DataStage PX.
- Successfully integrated data across multiple, high-volume data sources and target applications.
- Automated ETL processes using DataStage Job Sequencer and the AutoSys scheduler to schedule the jobs.
Environment: IBM Infosphere DataStage & QualityStage 8.5, Teradata 13, Shell Scripts, Sun-UNIX, Windows NT/XP, CA SCM, AUTOSYS R11, etc.
Confidential, Minneapolis, MN
Senior DataStage Consultant
Responsibilities:
- Worked extensively on data warehousing and E-R modeling; extensively used DataStage, an ETL tool, to design mappings that move data from source to target databases using stages.
- Designed parallel jobs using various stages such as Join, Transformer, Sort, Merge, Filter, Lookup, Sequence, Modify, Peek, XML, etc.
- Developed star schema data model using suitable dimensions and facts.
- Heavily involved in data extraction, transformation, and loading (ETL) from source to target systems using DataStage PX.
- Extensively used DataStage Manager, Designer, and Director for creating and implementing jobs.
- Successfully integrated data across multiple, high-volume data sources and target applications.
- Experience in code migration from DataStage 7.5 to DataStage 8.5.
- Experience in code migration across Dev/SIT/UAT/PROD.
- Extensively used the Oracle Enterprise stage as the source and target database.
- Extracted data from flat files and databases, applied ETL logic, and loaded the data into the target Oracle database.
- Worked on XML stages to transform the XML data and generate text files.
- Worked on custom SQL queries and procedures for custom coding within the jobs.
- Removed duplicate data and filtered records from the source system using the Remove Duplicates and Transformer stages to ensure high-quality processing of the incoming data.
- Worked on source data analysis to verify the quality of the data.
- Worked on performance tuning of the jobs and Oracle database queries.
Environment: IBM Infosphere DataStage & QualityStage V8.5/7.5.x, Oracle 11g, Shell Scripts, Sun-UNIX, Windows NT/XP, Aqua Data, TOAD, MS Visio.
Confidential, Minneapolis, MN
Senior Analyst
Responsibilities:
- Worked extensively on data warehousing; extensively used DataStage, an ETL tool, to design mappings that move data from source to target databases using stages.
- Obtained a detailed understanding of the data sources, flat files, and complex data schemas.
- Designed parallel jobs using various stages such as Aggregator, Join, Transformer, Sort, Funnel, Filter, Lookup, Sequence, Modify, Peek, etc.
- Worked on different data sources such as flat files and mainframe files (COBOL copybooks).
- Developed star schema data model using suitable dimensions and facts.
- Heavily involved in data extraction, transformation, and loading (ETL) from source to target systems using DataStage PX.
- Extensively used DataStage Manager, Designer, Administrator, and Director for creating and implementing jobs.
- Used Autosys to schedule and monitor job flows.
- Experience in migrating the project and scripts from the development server to the test and production servers using ClearCase baselines.
- Experience in code migration across Dev/SIT/UAT/PROD.
- To reduce response time, aggregated, converted, and cleansed large chunks of data during transformation.
- Created multiple-instance jobs, thereby reducing redundancy in job design (a sample invocation sketch follows this list).
- Involved in creating technical documentation for source-to-target mapping procedures to facilitate better understanding of the process and to incorporate changes as and when necessary.
- Successfully integrated data across multiple, high-volume data sources and target applications.
- Extensively used DataStage Director for job scheduling and for troubleshooting from log files, emailing production support as needed.
- Experience working on Teradata utilities like BTEQ, FASTLOAD, MLOAD, and FEXPORT.
- Created UNIX shell scripts to invoke DataStage jobs and to pre/post-process files.
- Imported metadata, table definitions, and stored procedure definitions using the Manager.
- Involved in performance tuning of the source and target at the DataStage level and during data loading.
- Performed unit testing of all jobs manually and monitored the data to check whether it matched.
- Strictly followed the change control methodologies while deploying the code from DEV to QA and Production.
- Assisted both the development and testing offshore teams.
- Involved in 24/7 ETL production support, maintenance, troubleshooting, problem fixing, and ongoing enhancements to the data mart.
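Illustrative sketch of invoking a multiple-instance DataStage job from a UNIX shell script, as noted above; each run is distinguished by an invocation id after the job name, and the project, job, and parameter names here are hypothetical.

```sh
#!/bin/sh
# Minimal sketch only: run a multiple-instance DataStage job once per region,
# using the region code as the invocation id (project, job, and parameter
# names are hypothetical; the dsenv path assumes a default engine install).
. /opt/IBM/InformationServer/Server/DSEngine/dsenv

for REGION in EAST WEST CENTRAL; do
    dsjob -run -wait \
          -param pRegion="$REGION" \
          -jobstatus EDW_PROJECT jLoadSales."$REGION"
    RC=$?
    # With -jobstatus: 1 = finished OK, 2 = finished with warnings.
    if [ "$RC" -gt 2 ]; then
        echo "jLoadSales.$REGION failed (status $RC)" >&2
        exit "$RC"
    fi
done
```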
Environment: IBM Infosphere DataStage 8.1, Ab-Initio Co>op & GDE 3.0.4.2, SQL, Teradata 13, Shell Scripts, Sun-UNIX, Windows NT/XP, ClearCase, AUTOSYS R11, Harvest, Anthill 3.0
Confidential, Dallas, TX
ETL Developer
Responsibilities:
- Worked with DataStage Designer to develop jobs for extracting, transforming, and loading the data into the target database.
- Modified existing ETL scripts to fix defects at the root cause as part of the project's enhancement work.
- Debugging and Scheduling DataStage Jobs/DataStage Mappings and monitoring error logs.
- Involved in extracting the data from different source systems such as flat files and Oracle tables.
- Monitored the DataStage jobs daily and executed jobs on demand in response to trouble tickets.
- Coordinated with onsite teams for further support and issue resolution.
Environment: IBM Infosphere DataStage 7.x, Oracle 9i, SQL, PL/SQL, Shell Scripts, AIX-UNIX, Windows XP.
Confidential, Princeton, NJ
Application Programmer
Responsibilities:
- Extensively worked with DataStage Designer to develop medium- and high-complexity parallel jobs.
- Extracted data from flat files and databases, transformed it, and loaded it into the target Oracle database.
- Used DataStage Manager to import and export the project components.
- Worked on custom SQL queries and procedures for custom coding within the jobs.
- Removed duplicate data and filtered records from the source system using the Remove Duplicates and Transformer stages to ensure high-quality processing of the incoming data.
- Performed ETL processes to extract, transform, and load the data to the target.
- Worked on source data analysis to verify the quality of the data.
- Scheduled AutoSys jobs to run on a schedule to populate data into the target database, and maintained log data.
- Involved in unit testing and prepared the unit test plan documents.
- Used the UNIX environment to execute the DataStage jobs.
- Worked on performance tuning of the jobs.
- Monitored the DataStage jobs daily and executed jobs on demand when issues arose.
Environment: DataStage 7.5.x (Designer, Director, Manager, Administrator), SYBASE 12.5, SQL, PL/SQL, Shell Scripts, AUTOSYS, Sun-UNIX, Windows NT/XP.