
Technical Lead Resume


SUMMARY

  • 12+ years of IT experience across the SDLC, including design, development, testing, implementation, and quality assurance of software systems in various industries.
  • 7+ years of experience in data analysis, data extraction, ETL, reporting, business intelligence, and application development.
  • 3+ years of experience with Agile methodologies across development, deployment, and production support.
  • Expert in importing data into database objects via flat files, ETL, database import utilities, and BCP in/out.
  • Managed data inventory across different database systems through reconciliation using SQL, ETL, and Excel macros.
  • Writing complex SQL queries, PL/SQL stored procedures and functions, packages, database triggers, and Oracle PL/SQL cursors.
  • Proficient in the design and implementation of Star and Snowflake schemas and multi-dimensional modeling.
  • Experience with various data-capture techniques, including converting text images to editable text using OCR.
  • Extensively worked in client-server application development using Oracle 8.0/8i/9i/10g/11g, PL/SQL, and SQL*Plus.
  • Extensive experience in application data modeling, building data warehouse architectures (Star and Snowflake schemas), and creating ER diagrams using Microsoft Visio.
  • Extensively worked in PL/SQL, creating stored procedures, database triggers, exception handlers, cursors, cursor variables, and PL/SQL test cases.
  • 3+ years of experience with data transfer through replication pipelines and data mining, including AWS data lakes.
  • Sentiment analysis on customer feedback data using NLP and R libraries (a Python sketch of the approach follows this list).
  • 3+ years of experience with Qlik Replicate, versions 3.1 through 6.3. Expertise in load reduction, restricted data access, improved service, geographic distribution, disaster recovery, data loading, creating copies of endpoints, and distributing data across endpoints.
  • Extensively worked with Oracle databases in data warehousing environments.
  • Working knowledge of Toad and SQL*Loader for performance tuning and database optimization.
  • Experienced in working with complex SQL queries involving joins, functions, cursors, tables, materialized views, sub-queries, and analytic queries to generate reports.
  • Expert in maintaining data integrity using various constraints and database triggers.
  • Proficient in interpersonal relations, teamwork with outsourcing partners, and organization, with excellent communication and presentation skills.
  • In-depth knowledge of the full development life cycle, including requirements analysis, design, development, testing, and implementation.
  • Performed import and export of various data formats using DTS, BCP, Bulk Insert, and SQL Server 2005/2008 Integration Services.
  • Handled various file formats, including JSON and EBCDIC, using COBOL SerDes and copybooks.
  • Performed forward and reverse engineering using the Erwin CASE tool.
  • Developed interactive reports using Crystal Reports.
  • Experience with Analysis Services for designing cubes and dimensions.
  • Provided support to the testing team during System Testing, System Integration Testing (SIT), and User Acceptance Testing (UAT).
  • Experience with UNIX OS and shell scripting.
  • Strong organizational and communication skills.
  • Excellent analytical skills, very good team player, highly motivated, enthusiastic and meticulous
  • Skilled at MS Office tools including MS Excel, MS Visio and MS Access.
  • Very good understanding of relational databases, data modeling, cubes, data mapping, and data warehousing, with expertise in writing requirements, technical specs, test plans, and test cases.
  • Good time management and multitasking ability which helps in conducting project meetings, reviews, walkthroughs, and customer interviews according to the varied needs of the people involved.
  • Strong skills in data analysis, impact analysis, risk analysis, and writing test cases for UAT.
  • Experience in Agile/Scrum sprint methodologies.
  • Pursuing an online IBM Data Science certification.
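
The sentiment-analysis work above used NLP and R libraries; purely as an illustration of the general approach, here is a minimal Python sketch using NLTK's VADER analyzer. The feedback strings and score thresholds are hypothetical examples, not project data.

    # Minimal sentiment pass over feedback text (illustrative sketch only).
    import nltk
    from nltk.sentiment import SentimentIntensityAnalyzer

    nltk.download("vader_lexicon", quiet=True)  # one-time lexicon fetch

    feedback = [
        "The new portal is fast and easy to use.",   # hypothetical sample
        "Support never called me back.",             # hypothetical sample
    ]

    sia = SentimentIntensityAnalyzer()
    for text in feedback:
        scores = sia.polarity_scores(text)  # neg/neu/pos plus a compound score
        if scores["compound"] >= 0.05:      # conventional VADER cutoffs
            label = "positive"
        elif scores["compound"] <= -0.05:
            label = "negative"
        else:
            label = "neutral"
        print(f"{label:8s} {scores['compound']:+.2f}  {text}")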

TECHNICAL SKILLS

Operating Systems: Windows Server 2000/2003/2008/2012, UNIX, and AIX

Databases: MS SQL Server 2000/2005/2008/2012, Oracle, Sybase, MS Access, Hive, HBase

Programming/Scripting & Services: Python, Spark & PySpark, PowerShell scripting, SQL, PL/SQL, Sybase subroutines, UNIX shell scripts, cron scripts & scheduling, MQ, JSON/EBCDIC/CSV/flat-file handling, IMS/mainframe copybook flattening and handling

Database Tools: Query Analyzer, DTS, BCP, BI, Visual Studio DTS, Replication and Always On, Qlik Replicate, Qlik Enterprise Manager, Qlik Information Suite, Voltage Encryption, MS Excel macro and VBA development, MS Visio, SSMS, SQL Server Enterprise Manager, SQL Profiler, SQL Server Configuration utility, IBM WebSphere

OLAP/BI Tools: Tableau, Alteryx, Qlik dashboards and Qlik Sense, Cognos 11, SSRS, SSAS, Crystal Reports, and Crystal Enterprise

ETL: Ascential DataStage 7.1, Expressor, BCP, DTS (Data Transformation Services), SSIS, Qlik Replicate, Alteryx, and Informatica BDM

PROFESSIONAL EXPERIENCE

Confidential

Technical Lead

Responsibilities:

  • Working on a framework to ingest data from various source technologies (SQL Server, Oracle, Teradata, DB2, IMS, and file-based sources) into the Hadoop environment, building on Hadoop ecosystem components such as MapReduce, HDFS, Hive, Pig, Sqoop, Java, Oozie, Cassandra, and Kerberos.
  • Redesigned and developed a critical ingestion pipeline to process over 200 TB for a single source.
  • Developed a pipeline to process data files from external vendors, and created a solution giving the Data Analytics team an interface to analyze the data.
  • Experienced with Sqoop for moving data from different sources into the Hadoop environment.
  • Involved in daily data ingestion using Qlik, and in Pig and Hive transformation scripts that perform various data validations before the data is ingested into Hadoop.
  • Using Qlik Replicate to replicate SQL Server, Oracle, DB2, IMS, and AS/400 data sources into Hadoop.
  • Replicating the source databases of around 300 applications into Hadoop.
  • Landing data via CDC, full-load, and kill-and-fill patterns in Qlik Replicate, depending on requirements.
  • Stitching base-load tables and change-tracking (CT) tables in Hive; involved in identifying the right technology to land the data based on source type and technology constraints.
  • Troubleshooting failures, data copies, logging, and the repository via Qlik task JSONs; applying global transformations at the Qlik task level and column transformations at the table level.
  • Managing full-load settings, store-changes settings, and large-object handling at the Qlik task level; handling source and target endpoint settings, internal parameters, change-processing timing, and different landing formats.
  • Archiving existing tables by task, partitioning the target files, and handling DDL changes.
  • Experience with Hue, the HDFS file system, the Hive database, and Oozie job scheduling and execution.
  • Experience with Hadoop, Hue, Hive, Attunity, Beeline, Sqoop, and QlikView, including command-line operation of HDFS and Hive; experienced in handling external and internal ingestion with Hadoop.
  • Writing PySpark scripts to consolidate the partitions created by Qlik Replicate and to merge small files (a sketch follows this list).
  • Used the Hadoop environment for ingestion and built jobs for automatic execution.
  • Worked on source and target latency issues, advanced restarts, and resuming tasks with extended options.
  • Involved in importing and exporting data (SQL, Oracle, CSV, and text files) from local/external file systems and RDBMSs to HDFS.
  • Implemented Always On availability between primary and secondary production servers.
  • Working on data transformations such as stitching and merging, and scheduling Oozie workflow jobs to execute MapReduce, Sqoop, Pig, and Hive jobs.
  • Working on a POC to import streaming data through a Kafka pipeline and NiFi workflow via Qlik Replicate.
  • Automating scripts to simplify Hive table creation, various configurations, and release activities.
  • Providing post-implementation support to launched applications.
  • Working with source database teams to understand the problems and complexities within each application, providing solutions, and creating end-to-end source-to-target data landing processes.
  • Creating shell scripts and scheduler jobs for workflow execution.
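
As referenced in the list above, a minimal PySpark sketch of the small-file consolidation step. The application name and HDFS paths are hypothetical; in practice the input path is wherever Qlik Replicate lands its partitioned output.

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("qlik-small-file-consolidation")  # hypothetical app name
        .enableHiveSupport()
        .getOrCreate()
    )

    # Read the many small part files landed for one table.
    landed = spark.read.parquet("/data/landing/app_orders/")  # hypothetical path

    # Rewrite them as a handful of larger files in a consolidated area;
    # coalesce(8) collapses the partitions without a full shuffle.
    (
        landed.coalesce(8)
        .write.mode("overwrite")
        .parquet("/data/consolidated/app_orders/")  # hypothetical path
    )

    spark.stop()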

Confidential

Sr. Programmer Analyst/Tech. Lead

Responsibilities:

  • Working with business partners and Plant ITM on business analysis, requirements gathering, and BRD design.
  • Impact assessments on the existing system and technical requirements for the target systems.
  • Data validation on source and target tables, reject handling, and report design.
  • Application support activities on the database side: troubleshooting error logs, designing cron jobs for monitoring, and preparing user manuals.
  • Database design, development, and performance analysis.
  • Extensively used external tables for data conversion in the staging area before loading the data into the respective dimension and fact tables.
  • Created different schemas and maintained schema objects.
  • Creating and editing database tables, views, materialized views, constraints, and indexes for faster data retrieval and better database performance.
  • Defined database triggers and PL/SQL stored procedures for business validations.
  • Writing PL/SQL stored procedures and functions to enforce business rules.
  • Extensively used indexes for better query performance.
  • Extensively wrote UNIX shell scripts to automate archival, purging, health checks of the production, ODS, and test servers, and automatic job scheduling (a sketch follows this list).
  • Played a primary role in implementing data warehousing strategies in various report modules for Chrysler in-house applications.
  • Involved in Cron job monitoring and production support.
  • Extracted, transformed, and loaded source system data from the distributed environment into the data warehouse database.
  • Functional Analysis & UI Design.
  • Database maintenance activities (Backup & Restore).
  • Involved in Technical Documentation, Functional Documentation, Unit test, Integration test, and writing the Test plan.
  • Participated in System test case preparation and execution.
  • Tuning SQL queries for better performance.
  • Applying constraints and writing triggers to enforce business rules.
  • Performance tuning, and partitioning fact tables and materialized views, to enhance performance.
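
As referenced in the list above, a minimal sketch of the kind of server health check that was automated, written here in Python with cx_Oracle for illustration (the original automation used UNIX shell scripts). Connection details and the free-space threshold are hypothetical.

    import os

    import cx_Oracle

    conn = cx_Oracle.connect(
        user="monitor",                      # hypothetical monitoring account
        password=os.environ["MONITOR_PW"],   # never hard-code credentials
        dsn="odsprod-db:1521/ODSPRD",        # hypothetical host and service
    )

    # Flag any tablespace with less than 1 GB free.
    query = """
        SELECT tablespace_name, ROUND(SUM(bytes) / 1024 / 1024) AS free_mb
        FROM dba_free_space
        GROUP BY tablespace_name
        HAVING SUM(bytes) < 1024 * 1024 * 1024
    """
    with conn.cursor() as cur:
        for name, free_mb in cur.execute(query):
            print(f"WARNING: tablespace {name} has only {free_mb} MB free")

    conn.close()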
