
Software/BI Developer Resume


El Segundo, CA

SUMMARY

  • 9+ years of IT experience in Data Warehousing, with emphasis on Business Requirements Analysis, Application Design, Development, coding, testing, implementation, and maintenance of client/server Data Warehouse and Data Mart systems.
  • Expert knowledge of SQL Server, T-SQL, Data/Dimensional Modeling, MSBI tools (SSIS/SSAS/SSRS), Python, Hive, Teradata, Vertica, AEM, Azure Data Factory, Azure Databricks, shell scripting, and Power BI reporting.
  • Well experienced in writing design documentation and implementation proposals for extraction of data from the data warehouse.
  • Experience in Design and Development of ETL methodology for supporting Data Migration, data transformations & processing in a corporate wide ETL Solution using Teradata.
  • Worked extensively with Dimensional modeling, Data migration, Data cleansing, ETL Processes for data warehouses.
  • Expertise in SQL Server and T-SQL (DDL, DML, and DCL) in constructing tables, joins, indexed views, indexes, complex stored procedures, triggers, and user-defined functions to facilitate efficient data manipulation and consistent data storage according to the business rules.
  • Experience managing a Big Data platform deployed in the Azure Cloud.
  • Implemented Copy activities and custom Azure Data Factory pipeline activities for on-cloud ETL processing.
  • Experience in Monitoring and Tuning SQL Server Performance
  • Experience in data modeling, database design, and normalization techniques using database diagrams, Erwin, and other data modeling tools.
  • Expert in creating, debugging, configuring, and deploying ETL packages designed in MS SQL Server Integration Services (SSIS).
  • Experience in configuration of the report server and report manager for job scheduling and granting permissions to different levels of users in SQL Server Reporting Services (SSRS).
  • Experience in creating OLAP Data Marts using SQL Server Analysis Services (SSAS).
  • Strong development skills with Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse, Azure Blob Storage, and Azure Storage Explorer.
  • Strong technical knowledge of the SQL Server BI suite (ETL, Reporting, Analytics, Dashboards) using SSIS, SSAS, SSRS, and various third-party tools.
  • Involvement in all phases of SDLC (Systems Development Life Cycle) from analysis and planning to development and deployment.
  • Experience in developing solutions by analyzing large data sets efficiently.
  • Good understanding of Data Mining and Machine Learning techniques.
  • Good Experience with databases, writing complex queries and stored procedures using SQL and PL/SQL.
  • Expertise in writing Shell-Scripts, Cron Automation and Regular Expressions.
  • Experience in scheduling sequence and parallel jobs using DataStage Director, UNIX scripts, and scheduling tools.
  • Experience in identifying and resolving ETL production root-cause issues.
  • Worked on technologies like Hadoop, AEM and Java.
  • Profound knowledge of the architecture of the Teradata database and experience with Teradata unloading utilities such as FastExport.
  • Experience in maintenance, enhancements, performance tuning of ETL code.
  • Involved in Unit testing and System testing to check whether data loaded into targets is accurate.
  • Good working experience in writing SQL and PL/SQL scripts including views and materialized views.
  • Expertise in using configuration management tools such as Subversion (SVN), Rational ClearCase, CVS, and Git for version control.
  • Expert in various Agile methodologies such as Scrum, Test-Driven Development, Incremental and Iterative development, and Pair Programming.
  • Experience in various stages of System Development Life Cycle (SDLC) and its approaches like Waterfall & Agile Model.
  • Implemented and followed a Scrum Agile development methodology within the cross functional team and acted as a liaison between the business user group and the technical team.
  • Involved in all phases of Software Development Life Cycle (SDLC) in large scale enterprise software using Object Oriented Analysis and Design.
  • Highly motivated team player with zeal to learn new technologies.
  • An excellent team member with good interpersonal relations, strong communication skills, and a high level of motivation, able to work effectively both individually and as part of a team.
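The data cleansing work noted above can be sketched as a minimal Python transform; the field names and the casting rule are hypothetical, not taken from any actual project:

```python
# A minimal sketch of the kind of cleansing step an ETL pipeline applies
# before loading a staging table. All field names are hypothetical.

def cleanse_row(row):
    """Trim whitespace, normalize empty strings to None, and cast amounts."""
    cleaned = {}
    for key, value in row.items():
        if isinstance(value, str):
            value = value.strip() or None
        cleaned[key] = value
    # Cast the (hypothetical) amount column so downstream aggregation is numeric.
    if cleaned.get("amount") is not None:
        cleaned["amount"] = float(cleaned["amount"])
    return cleaned

raw = {"customer_id": " 42 ", "amount": "19.99", "region": "  "}
print(cleanse_row(raw))
# {'customer_id': '42', 'amount': 19.99, 'region': None}
```

In a real pipeline this logic would typically live in an SSIS Derived Column/Script transform or a staging-layer stored procedure rather than standalone Python.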

TECHNICAL SKILLS

Languages: SQL, PL/SQL, JAVA, .NET, UNIX

Data Modeling: Star Schema Modeling, Snowflake Modeling, Erwin, and Dimensional Modeling.

Operating systems: Windows, UNIX, LINUX.

Other Tools: Teradata SQL Assistant, PuTTY, and SQL Developer.

Scripting: UNIX shell scripting, Batch Script, FTP

Process Models: Waterfall, Agile (Kanban Board) Models

PROFESSIONAL EXPERIENCE

Confidential, El Segundo, CA

Software/BI Developer

Responsibilities:

  • Involved in understanding the Requirements of the End Users/Business Analysts and Developed Strategies for ETL processes.
  • Used Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts (DML and DDL).
  • Managed all development and support efforts for the Data Integration/Data Warehouse team.
  • Implemented Copy activity, Custom Azure Data Factory Pipeline Activities.
  • Created C# applications to load data from Azure storage blob to Azure SQL, to load from web API to Azure SQL and scheduled web jobs for daily loads.
  • Published Power BI Desktop models to the Power BI Service to create highly informative dashboards, collaborate using workspaces and apps, and get quick insights about datasets.
  • Created notifications, alerts, and subscriptions to reports in the Power BI Service.
  • Used R to extract the data from Cosmos and load it into SQL database
  • Provided on-call support during product releases from lower-level to production environments.
  • Used Agile methodology for repeated testing.
  • Worked with TWS and Control-M scheduling tool for jobs scheduling.
  • Involved in Unit testing, User Acceptance testing to check whether the data is loading into target, which was extracted from different source systems according to the user requirements.
  • Developed UNIX KORN Shell wrappers to initialize variables, run graphs and perform error handling.
  • Verified that the implementation was done as expected, i.e., that code members were applied in the correct locations, schedules were built as expected, and dependencies were set as requested.
  • Analyzed the source data, made decisions on appropriate extraction, transformation, and loading strategies.
  • Performed impact assessment in terms of schedule changes, dependency impact, and code changes for various change requests on existing Data Warehouse applications running in a production environment.
  • Performed tuning and optimization of complex SQL queries using Teradata Explain.
  • Created partitions and bucketing by state in Hive to handle structured data.
  • Implemented dashboards that run HiveQL queries internally, including aggregation functions, basic Hive operations, and different kinds of join operations.
  • Managed and scheduled batch jobs on a Hadoop cluster using Oozie.
  • Developed shell scripts to load flat files from various sources into HDFS.
  • Ingested data from Teradata to Hadoop (Sqoop imports) and performed validations and consolidations of the imported data.
  • Developed design documentation and implementation proposal for extraction of data from data warehouse.
  • Developed modules to extract, process & transfer the customer data using Teradata utilities.
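The Hive partition-and-bucket layout described above (partition by state, hash records into a fixed number of buckets) can be sketched in Python; the bucket count, key names, and the hashing stand-in are illustrative assumptions, not the production values:

```python
# Sketch of a Hive-style layout: partition by state, then hash each record's
# key into one of NUM_BUCKETS buckets. Hypothetical keys and bucket count.

from collections import defaultdict

NUM_BUCKETS = 4

def bucket_for(key, num_buckets=NUM_BUCKETS):
    # Hive buckets by hash(key) mod num_buckets; this is a stable stand-in.
    return sum(ord(c) for c in key) % num_buckets

def layout(records):
    """Map each record to a Hive-like path: state=<state>/bucket_<n>."""
    placed = defaultdict(list)
    for rec in records:
        path = f"state={rec['state']}/bucket_{bucket_for(rec['customer_id'])}"
        placed[path].append(rec)
    return dict(placed)
```

In Hive itself this corresponds to `PARTITIONED BY (state STRING) CLUSTERED BY (customer_id) INTO 4 BUCKETS`; the sketch only shows how records map onto that directory layout.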

Confidential, Fountain Valley, CA

Software/BI Developer

Responsibilities:

  • Involved in understanding the Requirements of the End Users/Business Analysts and Developed Strategies for ETL processes.
  • Used Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Managed all development and support efforts for the Data Integration/Data Warehouse team.
  • Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts (DML and DDL).
  • Prioritized requirements to be developed according to Agile methodology.
  • Transformed Logical Data Model to Physical Data Model ensuring the Primary Key and Foreign key relationships in PDM, Consistency of definitions of Data Attributes and Primary Index considerations.
  • Worked on Teradata stored procedures and functions to conform the data and load it into the tables.
  • Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
  • Worked on optimizing and tuning Teradata views and SQL to improve batch performance and response time of data for users.
  • Worked closely with analysts to produce detailed solution-approach design documents.
  • Provided support during System Test, Product Integration Testing, and UAT.
  • Verified that the implementation was done as expected, i.e., that code members were applied in the correct locations, schedules were built as expected, and dependencies were set as requested.
  • Provided quick production fixes and proactively involved in fixing production support issues.
  • Involved in the complete software development life cycle (SDLC), including requirements gathering, analysis, development, testing, implementation, and deployment.
  • Coordinate with Configuration management team in code deployments.
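The logical-to-physical model translation noted above, preserving primary and foreign key relationships in the PDM, can be sketched as a small DDL generator; the table and column names are invented for illustration:

```python
# A hedged sketch of rendering physical DDL from a small logical model,
# keeping PK/FK relationships explicit. All names are hypothetical.

def to_ddl(table, columns, pk, fks=()):
    """Render a CREATE TABLE statement with PK and FK constraints."""
    lines = [f"  {name} {ctype}" for name, ctype in columns]
    lines.append(f"  PRIMARY KEY ({pk})")
    for col, ref_table, ref_col in fks:
        lines.append(f"  FOREIGN KEY ({col}) REFERENCES {ref_table}({ref_col})")
    return f"CREATE TABLE {table} (\n" + ",\n".join(lines) + "\n);"

ddl = to_ddl(
    "orders",
    [("order_id", "INTEGER"), ("customer_id", "INTEGER"), ("amount", "DECIMAL(10,2)")],
    pk="order_id",
    fks=[("customer_id", "customers", "customer_id")],
)
print(ddl)
```

In practice a modeling tool such as Erwin performs this forward-engineering step; on Teradata the physical design would also specify the Primary Index, which this sketch omits.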

Confidential, New York, NY

Software/BI Developer

Responsibilities:

  • Worked with complex SQL queries to test the data generated by the ETL process against the target database.
  • Worked on Teradata stored procedures and functions to conform the data and load it into the tables.
  • Worked closely with analysts to produce detailed solution-approach design documents.
  • Used SQL Assistant to query Teradata tables.
  • Performed impact assessment in terms of schedule changes, dependency impact, and code changes for various change requests on existing Data Warehouse applications running in a production environment.
  • Provided quick production fixes and proactively involved in fixing production support issues.
  • Strong knowledge of Teradata Data Mover for importing and exporting data.
  • Ensured that Quality Assurance test plans were executed before releasing product enhancements for user acceptance testing.
  • Created BTEQ scripts with data transformations for loading the base tables. Worked on optimizing and tuning the Teradata SQL to improve the performance of batch and response time of data for users.
  • Used the FastExport utility to extract large volumes of data and send files to downstream applications.
  • Provided performance tuning and physical and logical database design support in projects for Teradata systems.
  • Prepared test data for unit testing and data validation tests to confirm the transformation logic.
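The test-data validation step above can be sketched as follows; the transformation rule here (uppercasing region codes) is a hypothetical stand-in for the real ETL logic:

```python
# A minimal sketch of unit-level data validation: confirm a transformation
# rule against prepared test data. The rule itself is a hypothetical stand-in.

def transform(rows):
    # Stand-in transformation: normalize region codes to uppercase.
    return [dict(row, region=row["region"].upper()) for row in rows]

def validate(source, target):
    """Row counts must match and every region must be uppercased."""
    assert len(source) == len(target), "row count mismatch"
    for src, tgt in zip(source, target):
        assert tgt["region"] == src["region"].upper(), f"bad transform: {tgt}"
    return True

# Prepared test data exercises the rule before release.
source_rows = [{"id": 1, "region": "ca"}, {"id": 2, "region": "ny"}]
assert validate(source_rows, transform(source_rows))
```

The same count-and-value checks translate directly into SQL comparisons between staging and target tables during UAT.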
