SSIS Developer Resume


Alpharetta, GA

SUMMARY

  • Over 5 years of experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP, and client/server applications
  • EMC certified Data Science Associate with proficiency in data mining through statistical analytical techniques such as classification, association rules, sentiment analysis, topic modeling, time-series analysis, and regression
  • Extensively used SQL, PL/SQL, and T-SQL to create complex queries, SQL overrides, stored procedures, and triggers; expertise in writing UNIX and PowerShell scripts (a minimal sketch follows this summary)
  • Experienced in Azure SQL Server database deployment
  • Good knowledge in planning and implementation of data and storage management solutions in Azure (SQL Azure, Azure files, Queue storage, Blob storage)
  • Implemented PostgreSQL system backup procedures and updated site resources
  • Extensively worked on database applications using Oracle, SQL Server, PostgreSQL, and MySQL
  • Advanced experience working with MS SQL Server 2012/2016/2017/2019 and SQL Azure
  • Extensive knowledge and experience in dealing with RDBMS concepts including normalization, stored procedures, constraints, querying, joins, keys, indexes, data import/export, triggers, and cursors
  • Involved in creating Dashboards, reports as needed using Tableau Desktop and Tableau Server
  • Handling all domain and technical interaction with application users, analyzing client business processes, and documenting business requirements
  • Well versed in data migration/conversion and data extraction/transformation/loading using PL/SQL scripts
  • Worked on the development of dashboard reports for top-management Key Performance Indicators using various Tableau functionalities such as extracts, parameters, filters, contexts, data source filters, actions, functions, trends, hierarchies, sets, groups, calculations, LOD (level of detail) expressions, data blending, maps, joins, dual-axis charts, reference lines, and bins
  • Experience with Apache Hadoop components such as MapReduce, HDFS, Hive, Pig, and Sqoop, as well as big data platforms and tools including Pivotal HD, HAWQ, and Alteryx
  • Created workflows to fetch data from different sources into HDFS using Alteryx and scheduled the jobs
  • Knowledge of Express>It, Metadata Hub, Plans, and the Data Quality Environment in Ab Initio
  • Good knowledge of RDBMS and dimensional modeling (star and snowflake schemas)
  • Proficiency in creating scripts and monitoring jobs in Control-M
  • Experience in deploying Azure services to specific regions to integrate with web apps and key vaults
  • SDLC methodology experience in large data warehouse environments
  • Extensively used tools such as HP ALM (Quality Center), WinSCP, SQL Developer, Teradata SQL Assistant, and the MS Office suite
  • Reviewed Source to Target Mappings and validated business rules provided by Business Analysts
  • Involved in performance tuning of Ab Initio code, SQL queries and Database Load utilities
  • Provided DBA support during incidents, deployments and DR failovers
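
As a minimal illustration of the T-SQL and PowerShell work summarized above, the sketch below creates and invokes a simple stored procedure through the SqlServer module's Invoke-Sqlcmd; the instance, database, and object names are hypothetical placeholders:

```powershell
# A minimal sketch: create and invoke a stored procedure from PowerShell.
# Requires the SqlServer module; server, database, and object names are hypothetical.
Import-Module SqlServer

$server   = "DEVSQL01"        # hypothetical instance
$database = "SalesDW"         # hypothetical database

# Create a simple stored procedure (T-SQL embedded in a here-string)
$createProc = @"
CREATE OR ALTER PROCEDURE dbo.usp_GetOrdersByStatus
    @Status VARCHAR(20)
AS
BEGIN
    SET NOCOUNT ON;
    SELECT OrderID, CustomerID, OrderDate
    FROM dbo.Orders
    WHERE Status = @Status;
END
"@
Invoke-Sqlcmd -ServerInstance $server -Database $database -Query $createProc

# Invoke it and capture the result set as PowerShell objects
$rows = Invoke-Sqlcmd -ServerInstance $server -Database $database `
    -Query "EXEC dbo.usp_GetOrdersByStatus @Status = 'SHIPPED';"
$rows | Format-Table -AutoSize
```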

TECHNICAL SKILLS

Operating Systems: Windows, UNIX, LINUX

Databases: MS Access, Teradata, SQL Server, Oracle, PostgreSQL

ETL Tools: Ab Initio, Informatica Power Center, SSIS

Bug Tracking Tools: PVCS Tracker, Jira

Testing Tools: Quality Center, HP ALM, JIRA, Azure DevOps

Languages/Scripting: C, SQL, PL/SQL, T-SQL, UNIX shell scripting, PowerShell scripting

Scheduling Tools: Autosys, Tivoli, Maestro, Control-M

IDE: Visual Studio 2019

Cloud Platforms: Microsoft Azure, Callidus

BI/Reporting Tools: Tableau Desktop 9.x/10.x/2018.x/2019.x, MS Visual Studio, R Studio, SAS Enterprise Miner

PROFESSIONAL EXPERIENCE

Confidential, Alpharetta, GA

SSIS Developer

Environment: Visual Studio 2019 Preview, Microsoft SQL Server Management Studio, Microsoft Azure, T-SQL, PowerShell

Responsibilities:

  • Working on the SQL Server 2012/2016/2017/2019 Database Engine as needed in the development environment
  • Actively participated in interaction with users, team lead, DBAs and technical manager to fully understand the requirements of the system
  • Setting up connection strings and connecting to Azure SQL Databases from a locally installed SQL Server Management Studio (SSMS)
  • Developing SSIS packages that extract data from various sources and merge it into a single source for Power BI reports
  • Worked on implementing backup methodologies with PowerShell scripts for Azure services such as Azure SQL Database, Key Vault, storage blobs, and App Services (see the backup sketch after this list)
  • Supporting DB activities such as adding/dropping columns and tables, creating/modifying clustered and non-clustered indexes and stored procedures, and adding entries to seed scripts for tables
  • Performing SQL Server Replication as required to maintain consistency and integrity of the data
  • Primarily involved in data migration using SQL, SQL Azure, Azure Storage, Azure Data Factory, SSIS, and PowerShell
  • Involved in development and testing of code changes before deploying to lower environments
  • Involved in automating administration tasks such as database backups, AG backups, log backups, daily AG checks, DB file stats, drive space checks, integrity checks, mirroring, DB refreshes, and storage account deletion using PowerShell scripting
  • Responsible for migrating on-premises SQL Servers to Azure (PaaS, SaaS, or IaaS) databases
  • Involved in tuning T-SQL queries using performance tools such as Query Analyzer, execution plans, and SQL Profiler
  • Comparing production releases and reporting which stored procedures, functions, columns, and tables were newly added or dropped (see the comparison sketch after this list)
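
As an illustration of the PowerShell backup work mentioned above, here is a minimal, hedged sketch using the Az module; every resource name, credential prompt, and path is a hypothetical placeholder, not the actual production script:

```powershell
# A hedged sketch of the PowerShell backup pattern for Azure services.
# Requires the Az module and an authenticated session (Connect-AzAccount);
# all resource names and paths below are hypothetical.
Import-Module Az.Sql, Az.KeyVault, Az.Storage

$rg     = "rg-prod-data"      # hypothetical resource group
$stAcct = "stbackupsprod"     # hypothetical storage account
$key    = (Get-AzStorageAccountKey -ResourceGroupName $rg -Name $stAcct)[0].Value

# 1) Export an Azure SQL Database to a .bacpac in blob storage
New-AzSqlDatabaseExport `
    -ResourceGroupName $rg `
    -ServerName "sql-prod-01" `
    -DatabaseName "SalesDW" `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey $key `
    -StorageUri "https://$stAcct.blob.core.windows.net/backups/SalesDW.bacpac" `
    -AdministratorLogin "sqladmin" `
    -AdministratorLoginPassword (Read-Host -AsSecureString "SQL admin password")

# 2) Back up a Key Vault secret to an encrypted local blob file
Backup-AzKeyVaultSecret -VaultName "kv-prod-01" -Name "AppConnString" `
    -OutputFile "C:\Backups\AppConnString.blob"
```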
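And a minimal sketch of the release-comparison step, assuming the two releases are restored side by side as databases on the same instance; database and object names are hypothetical, and a production version would also compare schemas and columns:

```powershell
# List objects present in the new release database but not the previous one,
# and vice versa. SalesDW_v1/SalesDW_v2 and DEVSQL01 are hypothetical names.
$diffQuery = @"
SELECT n.name, n.type_desc, 'ADDED' AS change_type
FROM   SalesDW_v2.sys.objects AS n
WHERE  n.type IN ('P','FN','U')              -- procs, scalar functions, tables
  AND  NOT EXISTS (SELECT 1 FROM SalesDW_v1.sys.objects o
                   WHERE o.name = n.name AND o.type = n.type)
UNION ALL
SELECT o.name, o.type_desc, 'DROPPED'
FROM   SalesDW_v1.sys.objects AS o
WHERE  o.type IN ('P','FN','U')
  AND  NOT EXISTS (SELECT 1 FROM SalesDW_v2.sys.objects n
                   WHERE n.name = o.name AND n.type = o.type);
"@
Invoke-Sqlcmd -ServerInstance "DEVSQL01" -Query $diffQuery | Format-Table -AutoSize
```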

Confidential, Northbrook, IL

ETL Developer

Environment: Ab Initio (GDE 3.5.1 with Co-Op version 3.5.1), Enterprise Metadata Environment (EME), UNIX, PostgreSQL, Oracle, WinSCP, JIRA

Responsibilities:

  • Interacting with the business users to understand the requirements/needs and discuss the possible technical solutions to deliver the results for business review
  • Leading the estimation, reviewing the estimates, and communicating them to all stakeholders so that the customer is aware of the effort involved
  • Assessing the complexity involved in developing and updating requirements using Oracle SQL Developer, Ab Initio GDE 3.5.1, and UNIX
  • Performing data cleaning and creating DMLs to be used in Ab Initio for the XML files
  • Utilizing various components, transformation functions, filters, formats, partitions, subgraphs, and lookups to develop the data and files discussed during design sessions
  • Creating functions, stored procedures, packages, views, and triggers in Oracle SQL Developer as required by the project's programs
  • Developing reusable Ab Initio graphs that can load different tables using different loading strategies, and developing UNIX wrapper scripts that call the deployed graphs (the wrapper pattern is sketched after this list)
  • Implemented a Parameterized process using PSETs which was used to create numerous extracts based on the contract, individual and group specific combination
  • Responsible for Performance Tuning of complex SQL statements and Data mapping in Ab Initio
  • Performing code reviews and data validation of the UNIX shell scripts, SQL queries, and the rules built in the cloud platform to perform calculations based on metrics received from upstream
  • Scheduling the Ab Initio graphs (jobs) to run in Production
  • Installed, configured and migrated existing PostgreSQL databases
  • Maintained and optimized PostgreSQL database application performance through tuning
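
The production wrappers were UNIX shell scripts; the sketch below shows the same wrapper pattern in PowerShell for consistency with the earlier examples. The graph name, deploy path, and log location are hypothetical, and the script assumes the deployed Ab Initio graph is invoked as an executable script:

```powershell
# A PowerShell sketch of the wrapper pattern (hypothetical names and paths):
# resolve parameters, invoke the deployed graph script, and fail loudly on a
# non-zero exit code. Saved as e.g. run_graph.ps1 (hypothetical).
param(
    [Parameter(Mandatory)] [string]$GraphName,            # e.g. "load_customer_dim"
    [string]$RunDate = (Get-Date -Format "yyyyMMdd")
)

$deployDir = "/opt/abinitio/deploy/run"                   # hypothetical deploy location
$logFile   = "/var/log/etl/${GraphName}_${RunDate}.log"   # hypothetical log location

# Deployed graphs run as scripts; pass the run date through as a parameter
# and capture all output streams to the log file.
& "$deployDir/$GraphName.ksh" $RunDate *>&1 | Tee-Object -FilePath $logFile

if ($LASTEXITCODE -ne 0) {
    Write-Error "Graph $GraphName failed with exit code $LASTEXITCODE; see $logFile"
    exit $LASTEXITCODE
}
Write-Output "Graph $GraphName completed successfully for $RunDate"
```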

Confidential, Riverwoods, IL

ETL Developer

Environment: Ab Initio (GDE 3.1 with Co-Op 3.1.6), Enterprise Metadata Environment (EME), Oracle (11g), Teradata (14.1), UNIX shell scripting, Control-M, WinSCP, Quality Center, Internal File Transfer Utility, SQL Developer, Enterprise Change Management Tool

Responsibilities:

  • Responsible for requirement gathering, analysis and designing of the Ab Initio graphs
  • Created Ab Initio graphs that transfer data from various sources like Oracle, flat files and CSV files to the Teradata database and flat files
  • Developing UNIX shell scripts and purging data using Ab Initio instead of PL/SQL queries
  • Building the code and performing unit testing for the various monthly version requests
  • Building generic graphs that can be updated with the latest techniques, improving performance and reducing development time
  • Contributed to a file-watcher script that checks for the existence of a file and ensures the received file is complete (a sketch of the check follows this list)
  • Worked on performance tuning of Ab Initio graphs to reduce processing time
  • Creating various project artifacts and logs such as UTCs, the RTM, issue logs, and the production hand-off document
  • Optimizing performance by analyzing ETL workflows and query patterns
  • Dealing with processing logic, job scheduling, and performance issues
  • Participating in the peer-review process for code fixes to ensure the quality of deliverables
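
The original file watcher was a UNIX shell script; this PowerShell sketch shows the same checks with hypothetical paths and timings: wait for the feed file to appear, then treat it as complete once its size stops changing between polls:

```powershell
# A sketch of the file-watcher check (path and timings are hypothetical).
$feedFile   = "/data/inbound/daily_feed.dat"
$timeoutMin = 30

# 1) Existence check: poll until the file arrives or the timeout expires
$deadline = (Get-Date).AddMinutes($timeoutMin)
while (-not (Test-Path $feedFile)) {
    if ((Get-Date) -gt $deadline) { throw "File watcher timed out waiting for $feedFile" }
    Start-Sleep -Seconds 60
}

# 2) Completeness check: size unchanged across one polling interval
do {
    $size1 = (Get-Item $feedFile).Length
    Start-Sleep -Seconds 30
    $size2 = (Get-Item $feedFile).Length
} while ($size1 -ne $size2)

Write-Output "Received file $feedFile is complete ($size2 bytes)"
```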

Confidential

ETL Developer

Environment: Ab Initio (GDE 3.1 with Co-Op 3.1.6), Enterprise Metadata Environment (EME), Oracle (11g), Teradata (14.1), UNIX shell scripting, Control-M, WinSCP, Quality Center, Internal File Transfer Utility, SQL Developer, Enterprise Change Management Tool, Peregrine Service Management Tool, HP Application Lifecycle Management (ALM) Tool

Responsibilities:

  • Designing, developing, and documenting the high-level conceptual data process design for Confidential business applications.
  • Holding discussions with the business team to gather requirements and preparing technical design documents, change orders, project timelines, and deadlines.
  • Developing complex graphs using various components to read and extract data from multi-source platforms such as XML, mainframes, Oracle databases, HDFS, and flat files.
  • Using the Metadata Importer to import metadata from an EME technical repository, Oracle databases, and SAP BusinessObjects.
  • Creating UNIX wrapper scripts to run the graphs, and developing graphs to read, extract, validate, transform, format, and edit data according to business requirements.
  • Performance-tuning complex Ab Initio graphs, working with memory parameters and Ab Initio best practices to implement the ETL process.
  • Analyzing and optimizing the method of transforming existing data into another format according to the requirements of the new environment, and loading the data into other database structures.
  • Performing code reviews and optimization practices, and demonstrating the entire application design and functionality to business end users.
  • Handling and resolving daily tickets and incidents raised in JIRA and ServiceNow within the service-level agreement.
  • Scheduling batch jobs in the Autosys scheduler by interacting with the scheduling and cross-functional teams to get the timings and watch list.
  • Monitoring batch jobs and continuous flows on a daily basis, and providing support for any failure in a script or graph by backtracking the root cause and fixing it (a monitoring sketch follows this list).
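
A hedged sketch of the monitoring step using the standard AutoSys client commands (autorep to report status, sendevent to force a restart); the job name is hypothetical and the status parsing is simplified:

```powershell
# Simplified monitoring sketch: report an AutoSys job's status with autorep
# and force a restart after the failure has been analyzed. The job name is
# hypothetical; assumes the AutoSys client tools are on PATH.
$job = "ETL_DAILY_LOAD_BOX"

$report = & autorep -J $job            # prints job name, last start/end, status
if ($report -match "\bFA\b") {         # FA = failure in AutoSys status codes
    Write-Warning "Job $job failed; backtrack the root cause in the logs first"
    # After the fix is in place, force-start the job
    & sendevent -E FORCE_STARTJOB -J $job
}
```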
