Senior Software Developer Resume
Alpharetta, GA
SUMMARY
- 16+ years of experience in T-SQL scripting, data analysis, ETL routines, processes, coding, mapping, and tool configuration.
- Azure Cloud (Azure Data Factory, Azure Storage), Google Cloud Platform (BigQuery, Data Studio, Google Cloud Storage, and Google Sheets), and Amazon Web Services (Glue, Athena, Redshift, RDS, Lambda, S3, and Python for manipulating and aggregating data within AWS Glue).
- Quality-focused IT professional with experience as an Integration Consultant at Logility and previously as a Lead ETL Engineer at Confidential.
- Experience developing SSIS packages, Informatica mappings, and ETL processes to extract, transform, and convert data from one form to another.
- Proven ability writing T-SQL code and delivering ETL data solutions that meet corporate objectives and demands.
- Comfortable operating with a wide range of different ETL tools and platforms.
- Experience with both SQL Server and Oracle.
- Effective communicator; able to explain complex data analysis/logic in an easy-to-understand language.
- Communicated data findings to business Information Analysts, Data Acquisition Specialists, auditors, and clients.
- 15 years working with Informatica, SSIS, and SQL Server.
- Skilled in proactive identification and resolution of critical tools/systems/network issues.
PROFESSIONAL EXPERIENCE
Confidential - Alpharetta, GA
Senior Software Developer
Responsibilities:
- Worked in Azure Cloud, specifically Azure Data Factory (ADF): created ADF pipelines to replace SSIS packages and numerous SQL stored procedures. Over roughly six months, converted INSERT-statement-based ETL routines into ADF pipelines, added indexes to the underlying tables, and improved the efficiency and performance of the ETL processes.
- Working with dynamic SQL, created stored procedures to aggregate healthcare data (specifically cancer-related data), created SQL tables to store historical data, built tabular models to aggregate and apply additional complex logic to the data, then created Power BI dashboards to visualize the data through the tabular models. Maintained existing ETL processes.
- Currently working on a DataRobot project: writing PowerShell scripts that use API calls to load data into and extract data from DataRobot (a pre-built model used to determine critical information about cancer patients), loading both training and scoring data and receiving predictions from the model. Developed complex SQL logic to aggregate the data before loading it into DataRobot.
- Worked in Google BigQuery: created datasets, created and scheduled queries, and built a complete project with numerous tables that sourced multiple Data Studio dashboards.
- Created ETL routines in the Pentaho Data Integration suite that moved data from MongoDB to Google BigQuery, as well as routines that moved data from both Oracle and SQL Server to Google BigQuery, then built Data Studio dashboards on top for data visualization.
- Created and managed ETL routines in Google BigQuery: with a few lines of API calls, built scheduled routines that download live data from a website/ERP system and load it into BigQuery.
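The dynamic-SQL aggregation work described above can be illustrated with a minimal sketch; the table and column names are hypothetical, and SQLite stands in for SQL Server so the snippet is self-contained:

```python
import sqlite3

# In-memory database standing in for the SQL Server warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE patient_records (
        patient_id INTEGER,
        diagnosis  TEXT,
        visit_year INTEGER,
        cost       REAL
    )
""")
conn.executemany(
    "INSERT INTO patient_records VALUES (?, ?, ?, ?)",
    [
        (1, "cancer", 2022, 1200.0),
        (2, "cancer", 2022, 800.0),
        (3, "other",  2022, 300.0),
        (1, "cancer", 2023, 1500.0),
    ],
)

# Dynamic SQL: the grouping column is chosen at run time, mirroring a
# stored procedure that builds its statement from a parameter.
group_col = "visit_year"  # hypothetical parameter
query = (
    f"SELECT {group_col}, COUNT(*) AS visits, SUM(cost) AS total_cost "
    f"FROM patient_records WHERE diagnosis = 'cancer' "
    f"GROUP BY {group_col} ORDER BY {group_col}"
)
rows = conn.execute(query).fetchall()
print(rows)  # [(2022, 2, 2000.0), (2023, 1, 1500.0)]
```

In production the grouping parameter would come from a stored-procedure argument and would need to be validated against a whitelist before interpolation.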
Confidential
Lead ETL Engineer
Responsibilities:
- Used AWS Glue and Athena to structure and load S3 files into RDS or Redshift.
- Using AWS Glue, crawled data files in S3 buckets, joining and filtering data from different buckets into databases in RDS or Redshift.
- Managed an Amazon RDS environment, storing historical data and managing permissions to that data as required by the business.
- Stored large volumes of data in Redshift, using AWS Glue to dynamically load and manage data with different schemas/structures into the Amazon Redshift data warehouse.
- Used Jira to manage different projects, tasks and tickets.
- Served as ETL consultant for most projects across different parts of the organization.
- Built new data warehouses and maintained/modified existing ones.
- Built ETL data processes using different ETL tools such as SSIS and Informatica.
- Used Informatica PowerCenter to design ETL mappings that extracted data from Oracle and SQL Server data warehouses into other data warehouses.
- Used Informatica Designer to design processes that handled Type 1, 2, and 3 slowly changing dimensions.
- Used SQL Server as the data warehouse and SSIS as the ETL tool to extract, manipulate, and load data.
- Managed a team of three IT developers.
- Writing import/export programs that interpret data in HL7 messages before updating the target databases.
- Reviewing and analyzing HL7 messages for business intelligence and reporting
- Worked with Bootstrap
- Analyzing and reporting on HL7 log files
- Drafting and/or evaluating HL7 specs
- Building and monitoring HL7 interfaces between clinical applications
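The HL7 work listed above comes down to interpreting pipe-delimited segments before updating target databases. A minimal parsing sketch, with a fabricated sample message for illustration:

```python
# HL7 v2 messages are carriage-return-separated segments; fields within
# a segment are separated by '|'. This is a deliberately minimal parser
# (it keeps only the last occurrence of a repeated segment ID).
sample_msg = (
    "MSH|^~\\&|SENDER|FACILITY|RECEIVER|FACILITY|202301011200||ADT^A01|123|P|2.3\r"
    "PID|1||555-44-3333||DOE^JOHN||19800101|M\r"
)

def parse_hl7(message):
    """Split an HL7 v2 message into {segment_id: [fields]}."""
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments[fields[0]] = fields
    return segments

parsed = parse_hl7(sample_msg)
message_type = parsed["MSH"][8]   # 'ADT^A01'
patient_name = parsed["PID"][5]   # 'DOE^JOHN'
print(message_type, patient_name)
```

Note that in the HL7 standard the field separator itself counts as MSH-1, so positional indices after a plain split are off by one for the MSH segment; a real interface engine handles this along with component (`^`) and repetition (`~`) separators.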
Confidential
Integration Consultant
Responsibilities:
- Supply Chain Planning: while analyzing bill of materials (BOM) details, worked with a group of consultants to create processes that help clients build an effectively managed supply chain for their business.
- Worked with an in-house ETL tool called AdapLink to extract/load data to/from SQL Server and Oracle data warehouses. Using Informatica PowerCenter, created mappings that loaded data from a client's SQL data warehouse and ERP system into our staging database, applying complex transformations and business logic within the ETL process.
- Created ETL processes in DataStage 11.5: based on business requirements, built processes that loaded data from flat files into SQL tables and between SQL tables.
- Used a reporting/planning tool called LVS to analyze historical data.
- Worked with a team of engineers/consultants to translate business requirements into ETL data transformation logic.
- Completed full end to end integrations with multiple clients.
- Conducted technical AdapLink training for both clients and new consultants.
- Worked directly with clients to resolve SEV1 integration support issues.
- Served on the Technical Support team from 01/2018 through 08/2018 in a client-facing role, working directly with clients and our engineers to resolve urgent data processing failures as quickly as possible.
- Created training materials for our ETL application (AdapLink) and conducted training for both clients and new ETL Integration associates.
- Travelled to client sites to conduct application training.
- As an Integration Consultant, worked with implementation consultants to create detailed business requirements, then worked with the Integration Engineering team to create the most effective solution based on those requirements.
- Conducted monthly data maintenance within Salesforce
- Web Development using Java Hibernate, HTML, CSS.
- Based on business requirements, I used Microstrategy to create complex logic reports for the planning team to visualize some of their important data sets.
- While working on a new integration for IPG (a client of Logility), used Google Cloud Platform's BigQuery to query data, create new schemas/tables from existing data, and orchestrate an ETL process that dissected, aggregated, and produced cleaner, more useful data for business intelligence.
- Over the last 24 months, worked on end-to-end integrations for clients such as Starbucks, Coca-Cola, Ericsson, Intertape Polymer Group, ALDO, etc., with hands-on integration of client data/applications into our supply chain planning application.
- With SAP HANA as the ERP source system, implemented multiple integrations within a span of 18 months, loading data from SAP into SQL Server and Oracle data warehouses, then performing supply chain planning on the resulting data.
- Worked with different technologies/software to complete complex integration requests: web services, API calls, Postman, XML, SQL objects such as views and stored procedures, batch files, task schedulers, database management, etc.
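The BOM analysis mentioned at the top of this role is, at its core, a recursive explosion of a bill of materials into total component quantities. A toy rollup sketch (part names and quantities are hypothetical):

```python
# Each entry maps an assembly to its (component, quantity-per-parent) pairs.
bom = {
    "bike":  [("frame", 1), ("wheel", 2)],
    "wheel": [("rim", 1), ("spoke", 32)],
}

def explode(part, qty=1, totals=None):
    """Recursively roll up total component quantities for `qty` units of `part`."""
    if totals is None:
        totals = {}
    for component, n in bom.get(part, []):
        totals[component] = totals.get(component, 0) + qty * n
        explode(component, qty * n, totals)
    return totals

totals = explode("bike")
print(totals)  # {'frame': 1, 'wheel': 2, 'rim': 2, 'spoke': 64}
```

A supply chain planning tool performs this same rollup against the client's item and BOM tables, typically with level codes or a recursive SQL CTE instead of in-memory recursion.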
Confidential
ETL Developer/ BI Developer / Informatica and SQL Server Developer
Responsibilities:
- Using SSIS and Informatica as the ETL tools, converted raw transaction data into SQL tables; using SQL stored procedures and scripts, cached SQL table data into MicroStrategy cubes, which are accessed by reports used by business analysts and the accounting department to analyze daily transactions.
- Hired as one of the contractors on a brand-new team to create a new data model and a new SQL data warehouse, update/create ETL routines in Informatica and SSIS, and perform thorough data cleansing.
- Converted SQL queries in the source qualifier into actual Informatica mappings by pulling all the sources into the mapping; updated old mappings to improve data load efficiency.
- Created new mappings in Designer: first created an ODBC connection to a SQL Server or Oracle data warehouse, pulled those sources into Informatica Designer, then built mappings that used Informatica transformations to convert the date fields and some of the amount fields to the required data types before loading the data into other SQL Server or Oracle tables through a different ODBC connection. Monitored the jobs and verified the data loaded correctly by enabling the statistics option on the mapping.
- Designed ETL processes with Informatica and supported pre-designed Informatica ETL processes.
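The type-conversion step described above (recasting date and amount fields before load) can be sketched as follows; the field names and formats are hypothetical stand-ins for what an Informatica expression transformation would handle:

```python
from datetime import datetime
from decimal import Decimal

# Raw extract rows as they might arrive from the source qualifier,
# with dates and amounts still stored as strings.
raw_rows = [
    {"txn_date": "01/15/2023", "amount": "1,250.75"},
    {"txn_date": "02/03/2023", "amount": "99.00"},
]

def transform(row):
    """Recast string fields to the target table's types."""
    return {
        "txn_date": datetime.strptime(row["txn_date"], "%m/%d/%Y").date(),
        "amount": Decimal(row["amount"].replace(",", "")),
    }

clean_rows = [transform(r) for r in raw_rows]
print(clean_rows[0])  # {'txn_date': datetime.date(2023, 1, 15), 'amount': Decimal('1250.75')}
```

Decimal is used rather than float so monetary amounts survive the conversion exactly, mirroring a NUMERIC/DECIMAL target column.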