
Senior Technical Lead Resume


SUMMARY

  • Over 12 years of hands-on expertise in IT as an application developer and team lead, with experience across the software development life cycle including requirements, design, analysis, and testing.
  • Repeatedly demonstrated the ability to apply technical skills to achieve tactical and strategic business solutions, and leveraged interpersonal skills to motivate, direct, and inspire project team members.
  • Experience and understanding of designing and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse.
  • Developed ETL pipelines in and out of the data warehouse using a combination of Talend and Snowflake SnowSQL; wrote SQL queries against Snowflake.
  • Understand data transformation and translation requirements and which tools to leverage to get the job done; understand data pipelines and modern, cloud-based ways of automating them; test and clearly document implementations so others can easily understand the requirements, implementation, and test conditions.
  • Expertise in Snowflake cloud database.
  • Worked on best practices to save cost while using Snowflake cloud database.
  • Good understanding of Snowflake architecture.
  • Good understanding of the Snowflake storage and processing cost model.
  • Applied clustering keys on Snowflake tables to improve query performance (sketched after this list).
  • Worked on data loading and unloading between Snowflake and S3 using Parquet and JSON file formats (see the sketch after this list).
  • Experience with the different Snowflake stages.
  • Experience with external tables, data sharing, and cloning in Snowflake.
  • Experience with AnalyticsDX.
  • Expertise in implementing transformations such as data cleansing to improve data quality; worked on data transformation, data loading, and data type conversion.
  • Result-oriented, customer-focused, and articulate; conducting effective meetings, mentoring, and being proactive are some of my soft skills, beyond technical skills, for which I am well recognized across business units.
  • In-depth experience in Extraction, Transformation, and Loading (ETL) processes using Hadoop, Pig Latin, Spark, Pentaho, Talend, and DataStage, and in developing ETL strategies.
  • In-depth experience with AWS, S3, HDFS, Hive, HBase, Sqoop, Oozie, and Pig scripts.
  • Integrated Hadoop into traditional ETL, accelerating the extraction, transformation, and loading of massive structured and unstructured data.
  • Designed and developed POCs in Spark using Scala to compare the performance of Spark with Hive and SQL/Oracle.
  • Excellent hands-on experience in data warehouse design, development, and testing.
  • Experience with analytics tools such as Tableau and QlikView.
  • Experience working with Pentaho Smart Grid Execution for parallel processing to improve job performance while working with bulk data sources.
  • Successfully interacted with end users, developers, business analysts, and top management.
  • Proven skills in BI Technologies using different Designer tools, ETL and OLAP tools.
  • With strong process knowledge, helped the project bridge gaps between various teams: Development, DBA, Change Management, and Code Management.
  • Prepared and reviewed the ETL test plan, test scope documents, and test cases; key resource in the successful execution of unit testing (UT).
  • Excellent problem-solving, issue-resolution, and technical skills coupled with confident decision-making, enabling effective solutions that lead to high customer satisfaction.
  • Worked in an onshore-offshore model, managed the development team, and delivered complex projects following quality standards.
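
A minimal sketch of the Snowflake clustering and S3 load/unload work referenced in the bullets above. All table, stage, and column names are hypothetical and shown only to illustrate the pattern:

    -- Hypothetical table and stage names throughout
    -- Add a clustering key so large scans prune micro-partitions by date and region
    ALTER TABLE sales_fact CLUSTER BY (sale_date, region);

    -- Load Parquet files from an S3 external stage into Snowflake
    COPY INTO sales_fact
      FROM @s3_landing_stage/sales/
      FILE_FORMAT = (TYPE = PARQUET)
      MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;

    -- Unload query results back to S3 as JSON (JSON unload expects a single VARIANT column)
    COPY INTO @s3_export_stage/sales_extract/
      FROM (SELECT OBJECT_CONSTRUCT(*) FROM sales_fact WHERE sale_date >= '2020-01-01')
      FILE_FORMAT = (TYPE = JSON);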

TECHNICAL SKILLS

Operating systems: Windows XP, UNIX/LINUX

Database skills: Oracle 10.X/ Oracle 11G, SQL Server, MySQL, PostgreSQL, Teradata

Languages: SQL, PL/SQL, Scala

Big Data: Hadoop, HDFS, HIVE, HBASE, PIG, SQOOP, FLUME, SPARK

ETL Tools: Pentaho, Talend, Datastage.

ELT Tool: AnalyticsDX

Cloud: AWS, S3, Snowflake

Scheduling Tool: Control M

Warehouse Methodologies: Kimball, Inmon, hybrid Kimball/Inmon

Design Tools: Enterprise Architect

OLAP Tools: Mondrian

Reporting Tools: Pentaho, Jasper, Qlikview, Tableau

PROFESSIONAL EXPERIENCE

Senior Technical Lead

Confidential

Responsibilities:

  • Involved in requirements gathering through weekly tech governance calls and other channels such as emails, calls, meetings, and chats with the business team.
  • Created design documents, including prototypes of expected data layouts, and obtained business approval for them.
  • Coded and implemented the application framework end to end in Talend Big Data Spark batch jobs and standard jobs.
  • Involved in creating test plans for unit and system testing and executing those plans with successful results.
  • Implemented the business logic in transformations in Talend Studio.
  • Developed jobs in Talend using Talend components to write output to S3 and Snowflake.
  • Creation of stages and external tables in Snowflake.
  • Wrote SnowSQL queries to analyze the data in Snowflake.
  • Wrote COPY commands and scripts to load data from S3 into Snowflake for data validation by other teams (see the sketch after this list).
  • Responsible for ad hoc requests such as syncing the DEV and Test environments with Prod using S3 unload and copy.
  • Ensured ETL/ELT jobs succeeded and loaded data successfully into Snowflake.
  • Developed maps in AnalyticsDX to migrate components from Talend.
  • Deployed code in TAC (Talend Administration Center), maintained the scheduling, and provided production support.
  • Maintained the Version control of the code through GIT Repo.
  • Ensure smooth execution of the project by evaluating the work carried out by team members.
  • Regularly updated the concerned tech owners and business members on the progress of deliverables.
  • Handled user queries and analyzed issues pertaining to the Connext legacy application.
  • Ensured code changes were properly integrated and released to Prod on scheduled timelines without issues.
  • Point of contact for any Connext legacy system issues or queries.
  • Provided technical help to team members for resolving user queries.
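
A minimal sketch of the Snowflake stage, external table, and COPY-based load described in the responsibilities above; the bucket, integration, and table names are assumptions for illustration, not from the actual project:

    -- Hypothetical names throughout
    CREATE STAGE s3_raw_stage
      URL = 's3://example-bucket/raw/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (TYPE = PARQUET);

    -- External table exposing the staged Parquet files without loading them
    CREATE EXTERNAL TABLE claims_ext (
      claim_id   VARCHAR AS (value:claim_id::VARCHAR),
      claim_date DATE    AS (value:claim_date::DATE)
    )
    LOCATION = @s3_raw_stage/claims/
    FILE_FORMAT = (TYPE = PARQUET);

    -- Load the same files into a permanent table for validation by other teams
    COPY INTO claims_stg
      FROM @s3_raw_stage/claims/
      FILE_FORMAT = (TYPE = PARQUET)
      ON_ERROR = 'CONTINUE';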

Environment: Talend Real-Time Big Data 6.2/6.4, Spark Framework 2.1, Snowflake, Cassandra, AnalyticsDX, AWS S3, HDFS, Hue, TAC, UNIX shell scripting, Git, Java, PuTTY, WinSCP, JIRA.

Senior Technical Lead

Confidential

Responsibilities:

  • Handled importing of data from various data sources and performed transformations using Pig and Hive.
  • Loaded data into HDFS and imported/exported data from Oracle and SQL Server into HDFS using Sqoop.
  • Analyzed the data by performing Hive queries and running Pig scripts to produce reports for the customer (a simplified example follows this list).
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Designing and developing POCs in Spark using Scala to compare the performance of Spark with Hive and SQL/Oracle.
  • Extensively worked on Sqoop, transformation jobs, and intermediate Hadoop jobs.
  • Developed Control-M jobs to schedule jobs based on file arrival and similar conditions.
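
A simplified example of the kind of Hive reporting query described above; the database, table, and column names are made up for illustration:

    -- Hypothetical reporting query over data imported from Oracle/SQL Server via Sqoop
    SELECT c.region,
           COUNT(DISTINCT o.order_id) AS order_count,
           SUM(o.order_amount)        AS total_amount
    FROM   sales_db.orders o
    JOIN   sales_db.customers c
           ON o.customer_id = c.customer_id
    WHERE  o.order_date >= '2015-01-01'
    GROUP  BY c.region
    ORDER  BY total_amount DESC;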

Environment: CDH5, Hive, Pig, Spark, Scala, Oracle 11g, flat files, SQL Server 2008, UNIX shell scripting, Control-M.

Senior Technical Lead

Confidential

Responsibilities:

  • Led the development for delivery of a scalable, high-performing, high-volume DW system.
  • Accountable for retiring the legacy EDW system.
  • Understood the mapping documents and existing source data and prepared load strategies.
  • Performed data validation on the ingested data using Data Profiler by building a custom data rule language to filter out invalid data and cleanse the data.
  • Designed and implemented Hive and Pig UDFs for evaluating, filtering, loading, and storing data.
  • Created the Hive tables, per requirements, as internal or external tables defined with appropriate static and dynamic partitions for efficiency (see the sketch after this list).
  • Worked extensively with Sqoop for importing and exporting data from Oracle and Teradata to HDFS and Hive.
  • Performed complex joins on Hive tables and extensively used Pig for data cleansing.
  • Loaded and transformed large sets of structured and semi-structured data using Hive and Pig Latin.
  • Developed workflows in Oozie to automate loading data into HDFS and transformation with Pig.
  • Used Oozie to orchestrate the MapReduce jobs, worked with HCatalog to open up access to Hive's metastore, and used Hive for final reporting through Denodo.
  • Mentored the team in both technical and non-technical aspects.
  • Participated in implementation, validation, and rollback reviews for database changes as part of the change management process.
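
A rough sketch of the partitioned Hive table pattern described above; the schema, location, and names are assumptions for illustration only:

    -- External Hive table over data landed in HDFS, partitioned for partition pruning
    CREATE EXTERNAL TABLE edw.transactions (
      txn_id     BIGINT,
      account_id STRING,
      amount     DECIMAL(18,2)
    )
    PARTITIONED BY (txn_date STRING)
    STORED AS ORC
    LOCATION '/data/edw/transactions';

    -- Dynamic-partition load from a raw staging table
    SET hive.exec.dynamic.partition = true;
    SET hive.exec.dynamic.partition.mode = nonstrict;
    INSERT OVERWRITE TABLE edw.transactions PARTITION (txn_date)
    SELECT txn_id, account_id, amount, txn_date
    FROM   staging.transactions_raw;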

Environment: Hadoop, Hive, HBase, Pig Latin, Sqoop, Oracle and Teradata

Senior Technical Lead

Confidential

Responsibilities:

  • Led the design and development for delivery of a scalable, high-performing, high-volume DW system.
  • Accountable for retiring the legacy system.
  • Prepared the High Level design document.
  • Understood the existing objects and routines in DataStage and implemented the same in Pentaho.
  • Implemented new functionality such as restartability and parallel execution across multiple nodes, which are not out-of-the-box features of Pentaho.
  • Implemented the Smart Grid concept for parallel execution of the jobs in Pentaho.
  • Prepared ETL jobs in Pentaho to generate Data warehouse reconciliation reports.
  • Helped people understand the big picture.
  • Helped the Team to work on performance tuning of the jobs to meet the SLAs.
  • Gathered requirements from the business and prepared the design document.
  • Helped QA team to prepare the test cases.
  • Fixed issues in QA and helped the production team deploy the jobs into production.
  • Assist QA people with test case development and implementation.
  • Mentored the team in both technical and non-technical aspects.
  • Participated in implementation, validation, and rollback reviews for database changes as part of the change management process.
  • Supported the Production Team during Warranty Period.

Environment: Datastage 8.5, Pentaho 4.4.2, Oracle 11G

Technical Lead

Confidential

Responsibilities:

  • Analyzed the subject areas and understood the requirements.
  • Coordinated with business analysts to understand the requirements and develop the code.
  • Served as the data modeling expert, analyzed the source data, designed both the logical and physical data models using EA, and led the creation of all ETL specifications.
  • Responsible for development, testing, installation of Pentaho.
  • Configured the Pentaho Server with their LDAP directory.
  • Designed ETL transformations and jobs using PDI and set up the Carte server for remote execution and clustering to speed up processing.
  • Involved in visualization to prepare the reports and dashboards using Pentaho.
  • Responsible for developing test scripts, test cases, and technical design documents.
  • Designed OLAP cubes for the business layer and developed MDX queries for dashboards and analytical services such as action rules and input measures.
  • Responsible for maintaining solution integrity of the project, which includes controlling scope, managing solutions for change requests, and clarifying solution capabilities during all delivery phases.

Environment: Pentaho ETL, OLAP, Dashboards, MySQL, SQL Server, UNIX, MDX, Enterprise Architect

Senior Software Engineer

Confidential

Responsibilities:

  • Understood the data model and designed the jobs in Pentaho Kettle.
  • Actively involved in design phase of Pentaho Jobs and Dashboard development for the product.
  • Used the bulk loader for full loads to improve performance.
  • Used Java expressions to perform data cleansing, avoiding the use of JavaScript in order to improve performance.
  • Unit testing of the developed jobs as per the testing standards.
  • Created the component test plans and documents for the jobs and sequences that I built.
  • Reviewed the developed jobs based on the build review checklists.
  • Debugged the jobs with different test case data to verify correct output values.
  • Supported the team in system testing, System Integration Testing and Regression Testing.

Environment: Pentaho BI Suite, MySQL, SQL Server, UNIX, MDX

Senior Software Engineer

Confidential

Responsibilities:

  • Requirement mapping and finalizing project functionalities and design documents.
  • Created the transformations using PDI that pulled data from the Google Analytics server to the staging server.
  • Created the elasticubes using elasticube project manager.
  • Creating ETL transformations and Jobs and implementing the scripts to create the data warehouse in production server.
  • Automated the whole process, which updates the dimensional model on a monthly basis.
  • Unit testing of the developed jobs as per the testing standards.
  • Supported the team in system testing, System Integration Testing and Regression Testing.

Environment: Pentaho Data Integration, MySQL, PrismCube.

Software Engineer

Confidential

Responsibilities:

  • Configuring the Pentaho BI server with Tomcat server (Manual Deployment).
  • Worked on the visualization part to customize the BI Server
  • Preparing the OLAP schema for analysis.
  • Prepared MDX queries to retrieve the records from the OLAP cubes.
  • Created the business model in Metadata Editor for ad hoc reporting.
  • Created OLAP Cubes to do analysis.
  • Writing the MDX and XACTION to call the OLAP Cubes.
  • Created reports using Pentaho report designer.

Environment: Pentaho BI Server, Pentaho Data Integration, MYSQL, Schema Workbench, Pentaho Report Designer, Pentaho Metadata Editor, Apache Tomcat

Software Engineer

Confidential

Responsibilities:

  • Understood the requirement and Involved in the manual configuration of Pentaho with JBOSS server.
  • Worked on the graphical reports by using Pentaho chart engine.
  • Worked on the admin module to create users and roles.
  • Created OLAP Schemas for analysis reports
  • Created MDX to connect the OLAP schema to prepare reports.
  • Worked on xactions for dashboards and pivot tables.
  • Created OLAP Cubes to do analysis.
  • Created reports by using Pentaho report designer.

Environment: Pentaho BI Server, Schema Workbench, Design Studio, JBOSS, Postgres
