Techno Functional Lead Developer Resume

MI

SUMMARY:

  • Over 7 years of experience in Data Warehousing, including onsite work in the USA.
  • Currently engaged as a Lead Data Warehouse Developer.
  • Work with Business Analysts, Data Architects, Research Analysts and ETL Developers.
  • Strong working experience with IBM Infosphere DataStage (11.3), Teradata v12/13/14/15 and UNIX shell scripting.

TECHNICAL SKILLS:

  • IBM Infosphere DataStage (v8.5, 11.1, 11.3)
  • Teradata v14 (BTEQ, TPT, FastLoad, MultiLoad)
  • SQL Server
  • UNIX Shell Scripting
  • DB2
  • Tableau
  • Informatica MDM, Python, Hadoop
  • Code deployment tools: TFS and Anthill Pro
  • Scheduling tools: IBM Tivoli Workload Scheduler

PROFESSIONAL EXPERIENCE:

Confidential, MI

Techno Functional Lead Developer

Responsibilities:

  • Translate business requirements to technical specifications using IBM Infosphere DataStage, UNIX, IBM DB2 and Teradata v14.
  • Build dashboards using Tableau Desktop.
  • Analyze functional and non-functional requirements for the project.
  • Participate in requirement definition sessions.
  • Perform technical analysis.
  • Develop the application solution architecture and high- and low-level designs.
  • Create test plans, test scenarios and unit test scripts.
  • Develop Extract, Transform, Load jobs and scheduling scripts using Autosys.
  • Prepare test environment and test data.
  • Participate in all phases of the Software Development Lifecycle: Design, Development, Build, Unit Test, System Integration Test support, User Acceptance Test support, Production deployment and post-production support.

Confidential, MI

Techno Functional Lead Developer

Responsibilities:

  • Translate business requirements to technical specifications using IBM Infosphere DataStage, UNIX, IBM DB2 and Oracle.
  • Analyze functional and non-functional requirements for the project.
  • Participate in requirement definition sessions.
  • Perform technical analysis.
  • Develop the application solution architecture and high- and low-level designs.
  • Create test plans, test scenarios and unit test scripts.
  • Develop Extract, Transform, Load jobs and scheduling scripts.
  • Prepare test environment and test data.
  • Participate in all phases of the Software Development Lifecycle: Design, Development, Build, Unit Test, System Integration Test support, User Acceptance Test support, Production deployment and post-production support.
  • Conduct meetings with the offshore team to explain business requirements and review code developed offshore.

Confidential, PA

Techno Functional Lead Developer

Responsibilities:

  • Created a data dictionary to store golden records using Informatica MDM (data load, cleansing and standardization, delta detection).
  • Worked on Informatica BDM (Big Data Management) to process data stored in the Data Lake.
  • Worked on the DataStage migration from 8.5 to 11.3: capacity planning, creating users, coordinating with the UNIX team to open ports, setting up the server and setting up the metastore.
  • Monitored DataStage jobs for 3 months.
  • Raised IBM support tickets for issues that could not be resolved in-house.
  • Work with Research Scientists and the Business Team, along with Business Analysts and Data Architects, to define requirements and subsequently develop the data model and other requirement documents.
  • Participate in business meetings to help translate business problems into technical components.
  • Research the scope for automation and present it to business stakeholders.
  • Perform data and gap analysis.
  • Analyze production data to support business users.
  • Design and document the technical framework for processes.
  • Participate in data modeling and table design discussions.
  • Develop algorithms for the processes and code to be built.
  • Discuss database space and resource allocation with Database Administrators.
  • Build processes using multiple technologies and troubleshoot technical issues.
  • Build automation process and re-usable components for approved plans.
  • Troubleshoot technical issues for developers in Data Warehousing team.
  • Unit test the built process.
  • Gather and document business requirements.
  • Explain business problems to programmers.
  • Work with project managers to prepare work-effort estimates and support the change management process.
  • Support system testing by resolving issues and tracking them in the HP ALM tool.
  • Support User acceptance testing by addressing concerns raised by end users and helping them understand the system.
  • Sign off components after production deployment, schedule them using IBM Tivoli Workload Scheduler, and support production during late hours in case of code abends.
  • Perform code review, design review and provide review comments for offshore developers.
  • Train offshore developers.
  • Conduct Teradata technical interviews.
  • Conduct defect triage meetings.

Confidential, PA

Offshore ETL Team Lead

Responsibilities:

  • Organize and attend daily walk through calls with client’s business system analyst.
  • Involve in daily status calls with onsite coordinator.
  • Coordinate with business systems analysts, data modelers and DBAs.
  • Prepare dashboards and Causal Analysis Report.
  • Document source-to-target mappings based on requirements; perform source data analysis and highlight requirement gaps.
  • Design end-to-end solutions for requirements and delegate work to resources.
  • Resolve technical issues faced by peer developers.
  • Estimate the time/effort required for development and report it to the manager.
  • Develop programs for complex requirements.
  • Conduct peer reviews (performance tuning and design) for junior developers.
  • Conduct sessions for junior developers.
  • Interview internal and external resources for onboarding to project.
  • Document unit test cases.
  • Provide production support in the client's absence.
  • Troubleshoot technical issues for other projects within the organization.
  • Design and code re-usable components and automation tools.

Career Highlights and Achievements:

  • The client had an existing ETL framework built using DataStage, over which all programs ran; the framework controlled job flow and captured audit information, which made it possible to see the run status of each batch and of each program within the batch. The client needed to decommission DataStage to reduce licensing cost, so I built a complete replacement suite of re-usable Teradata/UNIX components, an ETL framework and automation mechanisms, providing an efficient and cost-effective substitute for IBM Infosphere DataStage. All developers in the team now deploy their code to run over this framework, which supports program flow control (running jobs in parallel or in sequence) and has restartability features (a rough sketch of the audit/restart idea appears after this list).
  • Developed an automation process called Smart Loader to load files into tables dynamically, without the user having to write MultiLoad or FastLoad scripts. The user specifies the metadata of the target table in a configuration file/table, along with details such as the filename, file delimiter and header/trailer skip counts. The underlying code uses UNIX and Teradata TPT (see the sketch after this list).
  • Built an automation tool to bring data from SQL Server to Teradata, where the user only needs to place the SQL Server query in a configuration table along with the DSN details.
  • Created an automation tool to perform soft inserts into code tables, rather than each developer having to write code (an upsert sketch appears after this list).
  • Built a custom re-usable process to identify deltas by storing the concatenated key columns along with a hash value of the row (see the sketch after this list).
  • Created a common re-usable validation tool: the checks are maintained in a table, and based on the Count, Sum, Min or Max value returned by each configured SQL and the configured comparison type (less than, greater than, etc.), the tool abends the process when a check fails and triggers an email with the failed checks rendered as an HTML table (a sketch appears after this list).
  • Created an automation tool to load files into simple code tables by comparing key fields and deciding whether an Update or an Insert has to be run.
  • Re-designed the Medical Claims process, which had been running for more than 13 hours, to run in less than 2 hours.
  • Created a tool for converting xls/xlsx files to text files on the Windows platform.
  • Developed a data quality tool that profiles the data in each field and returns the number of defaults, the distinct set of values and other information useful for BSAs.
  • Built end to end process for Medical Claims system.
  • Emerged as the point of contact (POC) for automation using Teradata and UNIX.
  • Trained on Hadoop basics.
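The sketches below illustrate a few of the ideas referenced above. They are rough, simplified examples, not the actual production code; all database, table, logon, path and configuration names in them are hypothetical. First, the audit-logging and restartability idea behind the DataStage-replacement framework, as a wrapper script around each ETL step:

```
#!/usr/bin/ksh
# Illustrative wrapper: log each job's run to a control table and skip it on
# re-runs of the same batch if it already completed (restartability).
# ETL_CTRL.ETL_AUDIT, tdprod and etl_user are hypothetical names.

BATCH_ID=$1        # batch identifier passed in by the scheduler
JOB_NAME=$2        # logical job name
JOB_SCRIPT=$3      # script that performs the actual extract/transform/load

# Skip the job if an earlier run of this batch already completed it.
STATUS=$(bteq 2>/dev/null <<EOF | grep -w 'COMPLETE'
.LOGON tdprod/etl_user,xxxxx;
SELECT STATUS FROM ETL_CTRL.ETL_AUDIT
WHERE BATCH_ID = ${BATCH_ID} AND JOB_NAME = '${JOB_NAME}';
.LOGOFF;
EOF
)
if [ -n "${STATUS}" ]; then
    echo "${JOB_NAME} already complete for batch ${BATCH_ID}; skipping."
    exit 0
fi

# Record the start, run the job, then record success or failure.
bteq <<EOF
.LOGON tdprod/etl_user,xxxxx;
INSERT INTO ETL_CTRL.ETL_AUDIT (BATCH_ID, JOB_NAME, STATUS, START_TS)
VALUES (${BATCH_ID}, '${JOB_NAME}', 'RUNNING', CURRENT_TIMESTAMP);
.LOGOFF;
EOF

sh "${JOB_SCRIPT}"
RC=$?
if [ ${RC} -eq 0 ]; then FINAL='COMPLETE'; else FINAL='FAILED'; fi

bteq <<EOF
.LOGON tdprod/etl_user,xxxxx;
UPDATE ETL_CTRL.ETL_AUDIT
SET STATUS = '${FINAL}', END_TS = CURRENT_TIMESTAMP
WHERE BATCH_ID = ${BATCH_ID} AND JOB_NAME = '${JOB_NAME}';
.LOGOFF;
EOF

exit ${RC}
```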
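The Smart Loader bullet could look roughly like this: a one-line config entry per target table, a generated TPT job-variables file, and a tbuild call against a generic load script. The config layout, the job-variable names and the path /etl/tpt/generic_load.tpt are assumptions for illustration only.

```
#!/usr/bin/ksh
# Illustrative Smart Loader driver.
# Assumed config line format: <target_table>|<data_file>|<delimiter>|<header_rows_to_skip>

CFG_LINE=$(grep "^${1}|" /etl/config/smart_loader.cfg)     # $1 = target table name

TARGET_TABLE=$(echo "${CFG_LINE}" | cut -d'|' -f1)
DATA_FILE=$(echo "${CFG_LINE}" | cut -d'|' -f2)
DELIMITER=$(echo "${CFG_LINE}" | cut -d'|' -f3)
SKIP_ROWS=$(echo "${CFG_LINE}" | cut -d'|' -f4)

# Generate a TPT job-variables file; variable names are illustrative and would
# have to match whatever the generic TPT load script expects.
cat > /tmp/${TARGET_TABLE}_jobvars.txt <<EOF
TargetTdpId          = 'tdprod',
TargetUserName       = 'etl_user',
TargetUserPassword   = 'xxxxx',
TargetTable          = '${TARGET_TABLE}',
SourceFileName       = '${DATA_FILE}',
SourceFormat         = 'Delimited',
SourceTextDelimiter  = '${DELIMITER}',
SourceSkipRows       = ${SKIP_ROWS}
EOF

# tbuild runs the generic file-to-table TPT load script with these variables.
tbuild -f /etl/tpt/generic_load.tpt -v /tmp/${TARGET_TABLE}_jobvars.txt \
       -j smart_load_${TARGET_TABLE}
```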
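The soft-insert and code-table load bullets amount to an upsert: compare on the key column, update when the key exists, insert when it does not. A BTEQ MERGE is one way to express that; the table and column names here are made up.

```
#!/usr/bin/ksh
# Illustrative soft insert into a code table via MERGE (hypothetical table names).
bteq <<EOF
.LOGON tdprod/etl_user,xxxxx;

MERGE INTO EDW.CLAIM_STATUS_CODE tgt
USING EDW_STG.CLAIM_STATUS_CODE_STG src
   ON tgt.STATUS_CD = src.STATUS_CD
WHEN MATCHED THEN
    UPDATE SET STATUS_DESC = src.STATUS_DESC,
               UPDT_TS     = CURRENT_TIMESTAMP
WHEN NOT MATCHED THEN
    INSERT (STATUS_CD, STATUS_DESC, INSRT_TS)
    VALUES (src.STATUS_CD, src.STATUS_DESC, CURRENT_TIMESTAMP);

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
EOF
exit $?
```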
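The delta-detection bullet can be sketched with Teradata's HASHROW function: keep the concatenated business key and a hash of the comparison columns from the previous run, then pick up rows whose key is new or whose hash changed. All table and column names are illustrative.

```
#!/usr/bin/ksh
# Illustrative delta detection: new keys, or existing keys whose row hash changed.
bteq <<EOF
.LOGON tdprod/etl_user,xxxxx;

INSERT INTO EDW_STG.MEMBER_DELTA
SELECT src.*
FROM EDW_STG.MEMBER_LANDING src
LEFT JOIN EDW_CTRL.MEMBER_HASH_HIST hist
  ON hist.KEY_CONCAT = TRIM(src.MEMBER_ID) || '~' || TRIM(src.PLAN_CD)
WHERE hist.KEY_CONCAT IS NULL                              /* brand-new key   */
   OR hist.ROW_HASH <> HASHROW(src.FIRST_NM, src.LAST_NM,
                               src.DOB, src.ADDR_LINE1);   /* changed columns */

.LOGOFF;
EOF
```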
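Finally, the validation/abend tool: run a check query, compare the result against its threshold, and on failure mail an HTML table and exit non-zero so the scheduler abends the step. In the real tool the check SQL, comparison type and recipients come from a configuration table; here one check is hard-coded and the mail recipient is a placeholder.

```
#!/usr/bin/ksh
# Illustrative data check with abend and HTML email on failure.
CHECK_NAME=${1:-NEGATIVE_CLAIM_AMOUNTS}
THRESHOLD=0

# Pull the check value from Teradata; keep only the last purely numeric output line.
ACTUAL=$(bteq 2>/dev/null <<EOF | awk '/^ *[0-9]+ *$/ { val = $1 } END { print val }'
.LOGON tdprod/etl_user,xxxxx;
SELECT COUNT(*) FROM EDW.CLAIM_DETAIL WHERE CLAIM_AMT < 0;
.LOGOFF;
EOF
)
ACTUAL=${ACTUAL:-0}

if [ "${ACTUAL}" -gt "${THRESHOLD}" ]; then
    # Render the failed check as a small HTML table so it reads cleanly in email.
    {
        echo "<html><body><table border=\"1\">"
        echo "<tr><th>Check</th><th>Actual</th><th>Threshold</th><th>Comparison</th></tr>"
        echo "<tr><td>${CHECK_NAME}</td><td>${ACTUAL}</td><td>${THRESHOLD}</td><td>greater than</td></tr>"
        echo "</table></body></html>"
    } > /tmp/${CHECK_NAME}.html

    # Attach the HTML file (classic AIX-style uuencode attachment).
    ( echo "Validation check ${CHECK_NAME} failed."
      uuencode /tmp/${CHECK_NAME}.html ${CHECK_NAME}.html
    ) | mailx -s "ETL check failed: ${CHECK_NAME}" dw_support@example.com

    exit 8      # non-zero so the scheduler marks the step abended
fi
exit 0
```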

Environment: DataStage 11.3, Teradata 14 and 15, AIX UNIX, IBM Tivoli Workload Scheduler, TFS, Anthill Pro, HP ALM.
