ETL Developer Resume
Kansas City, MO
SUMMARY:
- Motivated to grow into a well-rounded software development professional with multifaceted skills.
- Build strong relationships with colleagues and management to achieve maximum throughput.
- Enthusiastic and supportive team player, delivering high-quality, innovative solutions to meet ever-changing business requirements across diverse industries while expanding my programming knowledge to be an invaluable asset to the organization.
- Over 5 years of IT experience as an ETL/SQL developer, with in-depth experience in T-SQL, data warehousing, OLTP, SQL Server, C#, Java, C, C++, and Big Data Hadoop.
- Strong working experience creating tables, views, indexes, triggers, and stored procedures using T-SQL (DML and DDL), and debugging and analyzing complex stored procedures.
- Experience writing T-SQL queries using ranking functions, window functions, CTEs, and derived tables.
- Extensively worked with SSIS transformations such as Lookup, Aggregate, Cache, Character Map, Conditional Split, Slowly Changing Dimension (SCD), Data Conversion, Derived Column, Script Component, and Pivot.
- Extensive experience creating ETL solutions with varied data sources, including RDBMS, flat files, XML, and Excel.
- Experience creating jobs, alerts, and mail notifications, and scheduling SSIS packages using SQL Server Agent and Windows Scheduled Tasks.
- Experience in command-line execution of SSIS packages.
- Certified Hadoop Developer: qualified as an industry-recognized Hortonworks Data Platform Certified Developer (HDPCD) on Apache Hadoop frameworks by performing hands-on big data tasks on a Hortonworks Data Platform (HDP) cluster.
- Strong knowledge and understanding of Apache Hadoop frameworks and Hortonworks Data Platform 2.4, installed and managed with Ambari 2.2, including Pig 0.15.0, Hive 1.2.1, Sqoop 1.4.6, Flume 1.5.2, and HDFS. Attended the Centriq workshop “What is Big Data Really?”
- Hands-on experience with Microsoft SSAS, SSRS, and the MS Office Suite. IDEs: Microsoft Visual Studio, Confidential ADE, Eclipse, and NetBeans. Programming languages: C, C++, C#, Java (with data structures and DBMS); scripting languages: Perl, Shell.
- Deep analytical and technical aptitude, client-service skills, data-analysis expertise, and the ability to quickly learn and apply technologies to solve business issues in a time-sensitive environment.
- Strong collaborator and keen communicator with excellent interpersonal skills; result-oriented and hard-working, with initiative, energy, and team-building skills, quick to grasp new concepts and apply them productively.
- Completed the three-month Nalanda Training in Gurgaon, India, and a mini project on IP Fragmentation and Reassembly (SDLC).
- Attended ACF Training to improve behavioral competencies (communication, collaboration, and learning & innovation) and technical competencies (design, coding, review, and quality).
SKILL:
Databases: MS SQL Server 2005/2008 R2/2012/2014, Confidential 10g/11g, MySQL
ETL/SQL Tools: Microsoft SQL Server Integration Services (SSIS), SQL Server Management studio (SSMS).
Tools: MS Office Suite, CVS, Bugzilla
IDE: Visual Studio 2010/2013/2015, ADE, Eclipse, NetBeans
Big Data Frameworks: Apache Hadoop (Hortonworks HDP 2.4): HDFS, MapReduce, YARN, Hive, Pig, Sqoop, Flume, Spark (Scala), and the Ambari web interface.
Operating System: Windows, Linux, Android
Programming Languages: C, C#, C++, Java, T-SQL (with data structures)
Scripting Languages: Perl, Shell
Web Technologies: HTML, CSS, JSP, XML, PHP, JSCRIPT
Debugging Tool: Valgrind, splint, gdb, gcov, gprof
Filesystem: Confidential ACFS, HDFS
Areas of Interest: Business Intelligence, Big Data, Data Warehousing
SDLC: Agile, Waterfall, Iterative
WORK EXPERIENCE:
Confidential
ETL Developer, Kansas City, MO
Responsibilities:
- Worked in software development life cycle phases, relational database development, product support, and market analysis.
- Collaborated with the application developers and subject-matter experts to create conceptual, logical, physical data models, E-R design of the systems meeting requirements, business objectives and rules.
- Designed, developed, reviewed, tested, and performance-tuned stored procedures, T-SQL, and SQL queries, and supported ETL mappings and scripts for data marts using Microsoft SQL Server 2008/2012, OLTP, and SQL Server Integration Services (SSIS) 2008/2014.
- Designed ETL processes using SSIS to extract data from flat files and load it into SQL Server.
- Implemented a load-control system to track the execution of external source files and enforce their order of execution.
- Built advanced SQL queries, procedures, scripts, triggers, indexes, and cursors, and developed T-SQL testing scripts to validate new products and enhancements.
- Implemented triggers on database objects for automatic inserts and updates, and created views exposing data from one server on another.
- Implemented Error handling and Custom Logging to keep track of Package flow and Package Execution status.
- Converted SSIS packages from the 2008 version to SSIS 2013.
- Created complex T-SQL using ranking functions and CTEs.
- Set up command-line execution of packages in a .bat file and scheduled it as a Windows Scheduled Task.
- Implemented C# logic to download and parse JSON files through a web API in SSIS.
- Created an SSIS task to archive flat files by moving them to an archive folder.
- Involved in fixing issues when the scheduled job fails.
- Reviewed and tested packages.
- Researched, analyzed, rectified and verified performance related issues in a time sensitive environment.
- Wrote SQL queries and stored procedures to generate data for reports.
- Received appreciation from the team and customers for proven competency.
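The trigger work described above (automatic updates on database objects) can be sketched as follows. This is a minimal illustration using Python's bundled sqlite3 in place of SQL Server T-SQL; the `orders` table and its `last_modified` column are hypothetical, not from the actual project.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical audit pattern: an AFTER UPDATE trigger keeps a
# last_modified timestamp current whenever a row changes.
cur.executescript("""
CREATE TABLE orders (
    order_id      INTEGER PRIMARY KEY,
    status        TEXT NOT NULL,
    last_modified TEXT DEFAULT (datetime('now'))
);

CREATE TRIGGER trg_orders_update
AFTER UPDATE ON orders
FOR EACH ROW
BEGIN
    UPDATE orders
    SET last_modified = datetime('now')
    WHERE order_id = NEW.order_id;
END;
""")

# Insert with a stale timestamp, then update; the trigger refreshes it.
cur.execute("INSERT INTO orders VALUES (1, 'open', '2000-01-01 00:00:00')")
cur.execute("UPDATE orders SET status = 'shipped' WHERE order_id = 1")
stamp = cur.execute(
    "SELECT last_modified FROM orders WHERE order_id = 1"
).fetchone()[0]
print(stamp)  # no longer the stale 2000-01-01 value
```

The same shape carries over to T-SQL, where the trigger body would read from the `inserted` pseudo-table instead of `NEW`.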
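The RANK-plus-CTE pattern mentioned above can be sketched like this; a minimal example run through Python's bundled sqlite3 (SQLite 3.25+ supports window functions) rather than SQL Server, with hypothetical sales data.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE sales (region TEXT, rep TEXT, amount REAL);
INSERT INTO sales VALUES
    ('East', 'Ann', 500), ('East', 'Bob', 700),
    ('West', 'Cid', 300), ('West', 'Dee', 900);
""")

# A CTE aggregates totals per rep; RANK() then orders reps
# within each region by that total.
query = """
WITH rep_totals AS (
    SELECT region, rep, SUM(amount) AS total
    FROM sales
    GROUP BY region, rep
)
SELECT region, rep, total,
       RANK() OVER (PARTITION BY region ORDER BY total DESC) AS rnk
FROM rep_totals
ORDER BY region, rnk;
"""
rows = cur.execute(query).fetchall()
for row in rows:
    print(row)
```

The query text is valid T-SQL as well, so the same CTE-plus-window-function shape applies directly on SQL Server.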
Environment: SSIS, SSRS, T-SQL (DDL, DML, CRUD), MS SQL Server 2014, Confidential, MySQL, Windows Server 2008 & 2012, Agile SDLC. Software: MS Office Suite, Visual Studio 2013/2015.
Confidential
Software Engineer
Responsibilities:
- Worked closely with business technology analysts to understand design and business requirements. Experienced in estimating time for each requirement, designing object models and class diagrams, and creating reusable components.
- Expert at setting up systems with the software and tools needed for development. Set up standalone and portable Confidential Clusterware (PCW), started ASM instances to use ASM disks, and created volumes for mounting file systems.
- Extensive experience developing Java programs that adhere to coding standards specified by technical management. Proficient in object-oriented analysis and in applying proven design patterns to build high-performance applications.
- Worked on ACFS functionality such as compression and replication: ACFS compression is a cost-effective way to reduce disk-storage needs in varied client ecosystems, while ACFS replication provides disaster-recovery capability for the file system.
- Developed a Java application to determine and manage system capacity and stability, and designed a highly responsive application in a Java IDE. Experienced with the Confidential Application Development Framework (ADF), which simplifies Java development.
- Expertise in developing multi-threaded, multi-process, and FOST applications, designed, coded, and run to verify that ACFS can scale without degrading performance or stability when handling large files on ported platforms including Linux, Windows, Solaris, and AIX (x86 64-bit). Involved in designing use-case, sequence, and class diagrams for the Java application.
- Wrote Perl scripts to validate use cases running across different operating systems under different loads, triaged test failures with an inline source viewer to increase productivity, and used shell scripts to perform complex analysis tasks.
- Used IDEs such as Eclipse, NetBeans, and ADF for development. Experienced with source-code change management and version control using ADE (Confidential Advanced Development Environment) as a repository for managing and deploying application code, checking files in and out, and branching and merging back into the main repository.
- Talented at managing requirements analysis, functional specifications, and requirements documentation. Quick to learn new technologies and adapt to new environments, with excellent planning, monitoring, and troubleshooting skills for both front-end and back-end issues.
- Received appreciation from analysts and users for developing a highly interactive Java application that determines each file's exact storage need, compression ratio, and rate of fragmentation.
Environment: Linux, Windows, Solaris, AIX (x86 64-bit); Java, Perl, Shell; Confidential ADE (Advanced Development Environment), ADF, Eclipse, NetBeans.
Confidential
ETL Developer
Responsibilities:
- Requirement gathering, analysis and data modeling.
- Designed ETL processes using SSIS to download flat files from SFTP, extract the data, and load it into the target database.
- Created SSIS task to archive the flat files and move to the archive folder.
- Designed ETL processes using SSIS to extract data from flat files and load it into SQL Server.
- Created SSIS packages using transformations such as Data Conversion, Derived Column, Lookup, Union All, and Slowly Changing Dimension.
- Wrote SQL queries and stored procedures to generate data for reports.
- Created complex T-SQL queries and stored procedures using CTEs, temp tables, ranking functions, string functions, and aggregate functions.
- Created non-clustered indexes to improve query performance and optimization.
- Created table constraints and validated database integrity.
- Implemented Error handling techniques.
- Familiar with big data architecture, including Hadoop data ingestion with Sqoop and Flume, data transformation with Pig scripts, and data analysis.
- Integrated logging methods to trace errors, analyze problems, fix bugs, and ensure better service connections and quality. Able to dive deep into technical issues and resolve them with the best solutions.
- Executed Hive queries that improved back-end stability, supported business intelligence and market analysis, and enriched consumer satisfaction.
- Attended internal Hadoop training program.
- Fixed bugs after analyzing the code. Curious to explore new ideas and concepts; a keen communicator with excellent interpersonal skills.
- Guided new joiners with product and technical training on the project.
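The indexing and constraint work listed above can be sketched as follows; a minimal illustration using Python's bundled sqlite3 in place of SQL Server (SQLite's ordinary secondary indexes play the role of non-clustered indexes), with a hypothetical `invoices` table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# NOT NULL and CHECK constraints validate integrity at write time;
# the secondary index speeds lookups by customer_id.
cur.executescript("""
CREATE TABLE invoices (
    invoice_id  INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL,
    amount      REAL NOT NULL CHECK (amount >= 0)
);
CREATE INDEX ix_invoices_customer ON invoices (customer_id);
""")

cur.execute("INSERT INTO invoices VALUES (1, 42, 19.99)")
try:
    cur.execute("INSERT INTO invoices VALUES (2, 42, -5.0)")  # violates CHECK
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
print(rejected)

# EXPLAIN QUERY PLAN shows the secondary index serving the lookup.
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM invoices WHERE customer_id = 42"
).fetchall()
print(plan)
```

On SQL Server the equivalent DDL would be `CREATE NONCLUSTERED INDEX` plus the same constraint clauses; the integrity check and index-backed lookup behave the same way.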
Environment: SSIS, T-SQL, MS SQL Server 2008 R2/2012, Confidential, MySQL, Windows Server 2008, Big Data Hadoop (Hortonworks Data Platform, HDP), and Microsoft Visual Studio.