Sample Template of a Professional Curriculum Vitae / Resume / CV Format with Career Objective, Job Description, Skills & Work Experience for Freshers & Experienced Candidates, Free to Download in Word / Doc / PDF
Ruby
ruby@gmail.com
205-510-9876
PROFESSIONAL SUMMARY:
• 7+ years of IT experience, including 5 years of Teradata development and design of ETL methodology supporting data transformation and processing in a corporate-wide ETL solution using Teradata TD12.0/TD13.0 and Ab Initio; experienced in administration, analyzing clients' business needs, developing effective and efficient solutions, and ensuring client deliverables within committed deadlines.
• Proven track record in planning, building and managing successful large-scale Data Warehouse and decision support systems. Comfortable with both technical and functional applications of RDBMS, Data Mapping, Data Management, Data Transportation and Data Staging.
• 4+ years of OLTP, ODS and EDW data modeling (logical and physical design, and schema generation) using Erwin, ER/Studio and other modeling tools for Teradata.
• 5+ years administering large Teradata database systems in development, staging and production.
• Expert developer skills in Teradata RDBMS, including initial Teradata DBMS environment setup and development.
• Expert in using administrative utilities such as Archive/Restore, Table Rebuild, CheckTable, Configuration, Reconfiguration, Filer and DIP, as well as OLAP, OLTP, ETL and BI.
• Strong hands-on experience using Teradata utilities (FastExport, MultiLoad, FastLoad, TPump, BTEQ and QueryMan).
• Proficient in Teradata TD12.0/TD13.0 database design (conceptual and physical), query optimization and performance tuning.
• 4+ years of industry experience in developing strategies for ETL (Extraction, Transformation and Loading) mechanisms using the Ab Initio tool in complex, high-volume Data Warehousing projects on both Windows and UNIX.
• Strong hands-on experience with Ab Initio GDE (3.0/1.15/1.14/1.13) and Co>Op (3.0/2.15/2.14/2.13/2.12/2.11).
• Expert knowledge of various Ab Initio components such as Join, Reformat, Scan, Rollup, Normalize, Denormalize, and partitioning and de-partitioning components.
• Well versed in various Ab Initio parallelism techniques; implemented Ab Initio graphs using data parallelism and MFS techniques.
• Expertise in testing Ab Initio graphs and Teradata jobs using JILs and scheduling them in Autosys.
• Configured the Ab Initio environment to communicate with databases using db config files and the Input Table, Output Table and Update Table components.
• Developed various UNIX shell scripts to run Ab Initio and database jobs.
• Expert in designing star schemas; well versed with UNIX shell wrappers, KSH and Oracle PL/SQL programming.
• Good knowledge of dimensional data modeling, star/snowflake schema design, fact and dimension tables, and physical and logical data modeling.
• Experience with business intelligence reporting tools such as Business Objects, Cognos and Hyperion.
• Experience in supporting large databases and troubleshooting problems.
• Experience in all phases of the SDLC, including system analysis, application design, development, testing and implementation of data warehouse and non-data warehouse projects.
EDUCATION: Bachelor of Engineering from JNTU University, Hyderabad, India
PROFESSIONAL EXPERIENCE:
WELLS FARGO BANK, NC Jan’11 – Till Date
Teradata Developer
Responsibilities:
• Developed scripts for loading data into the base tables in the EDW, and for loading data from source to staging and from the staging area to target tables using the FastLoad, MultiLoad and BTEQ utilities of Teradata.
• Wrote scripts for data cleansing, data validation and data transformation for data coming from different source systems.
• Performed application-level DBA activities: creating tables and indexes, and monitoring and tuning Teradata BTEQ scripts using the Teradata Visual Explain utility.
• Wrote complex SQL using joins, sub-queries and correlated sub-queries; expertise in SQL queries for cross-verification of data.
• Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move data from staging into base tables (a representative sketch of this staging-to-base pattern appears after this list).
• Performed space management for Perm and Spool space.
• Reviewed SQL for missing joins and join constraints, data format issues, mismatched aliases and casting errors.
• Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
• Dealt with initial, delta and incremental data as well as migration data to load into Teradata.
• Analyzed data and implemented multi-value compression for optimal usage of space.
• Performed query analysis using Explain, checking for unnecessary product joins, confidence factors, join types and the order in which tables are joined.
• Very good understanding of database skew, PPI, join methods and join strategies, and join indexes including sparse, aggregate and hash join indexes.
• Used the Teradata Analyst Pack extensively, including Teradata Visual Explain, Teradata Index Wizard and Teradata Statistics Wizard.
• Used derived tables, volatile tables and global temporary (GTT) tables extensively in many of the ETL scripts.
• Tuned Teradata SQL statements using Explain, analyzing the data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated sub-queries, using hash functions, etc.
• Loaded flat files into databases using FastLoad and then used them in queries to perform joins.
• Used SQL to query the databases and do as much crunching as possible in Teradata, using very complicated SQL query optimization (Explain plans, collect statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
• Used PMON and Teradata Manager to monitor the production system during the online day.
• Excellent experience in performance tuning and query optimization of Teradata SQL.
• Developed mappings in Ab Initio to load data from various sources using Ab Initio components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather and Merge.
• Created checkpoints and phases to avoid deadlocks, tested the graphs with sample data, then committed the graphs and related files into the repository from the sandbox environment; scheduled the graphs using Autosys and loaded data into target tables from the staging area using SQL*Loader.
• Implemented data parallelism using the multi-file system and partition and de-partition components, and performed repartitioning to improve overall performance.
• Developed graphs separating the extraction, transformation and load processes to improve the efficiency of the system.
• Involved in designing load graphs using Ab Initio and tuned the performance of queries to make the load process run faster.
• Extensively used partition components and developed graphs using Write Multi-Files, Read Multi-Files, Filter by Expression, Run Program, Join, Sort, Reformat and Dedup.
• Used data profiling to identify problems in the data that had to be fixed.
• Performed validations, data quality checks and data profiling on incoming data.
• Used the Enterprise Meta Environment (EME) for version control and Control-M for scheduling.
• Used AIR commands to perform dependency analysis for all Ab Initio objects.
• Tested and tuned Ab Initio graphs and Teradata SQL for better performance.
• Developed UNIX shell scripts to run batch jobs in Autosys and load into production.
• Interacted with different teams to find failures of jobs running in production systems, provided solutions, restarted jobs and made sure jobs completed within the specified time window.
• Provided 24x7 production support for the Teradata ETL jobs on daily, weekly and monthly schedules.
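Below is a minimal, hypothetical sketch of the staging-to-base load pattern with multi-value compression and statistics collection described above; all database, table and column names (EDW_STG, EDW_BASE, acct_txn and so on) are invented for illustration and are not taken from the actual project.

```sql
/* Hypothetical base table: multi-value compression (COMPRESS value lists)
   on low-cardinality columns to save Perm space, and a UPI chosen for
   even row distribution across AMPs. */
CREATE TABLE EDW_BASE.acct_txn
(
    txn_id      DECIMAL(18,0) NOT NULL,
    acct_id     INTEGER       NOT NULL,
    txn_type    CHAR(2)  COMPRESS ('DP','WD','TR','FE'),
    txn_status  CHAR(1)  COMPRESS ('P','C','R'),
    txn_amt     DECIMAL(15,2),
    txn_dt      DATE FORMAT 'YYYY-MM-DD'
)
UNIQUE PRIMARY INDEX (txn_id);

/* Incremental move from the staging table into the base table:
   insert only rows that are not already present. */
INSERT INTO EDW_BASE.acct_txn
SELECT s.txn_id, s.acct_id, s.txn_type, s.txn_status, s.txn_amt, s.txn_dt
FROM   EDW_STG.acct_txn_stg s
LEFT JOIN EDW_BASE.acct_txn b
       ON b.txn_id = s.txn_id
WHERE  b.txn_id IS NULL;

/* Refresh statistics so the optimizer keeps producing good join plans. */
COLLECT STATISTICS ON EDW_BASE.acct_txn COLUMN (acct_id);
COLLECT STATISTICS ON EDW_BASE.acct_txn COLUMN (txn_dt);
```

In practice a statement like this would typically be wrapped in a Teradata macro or BTEQ script and scheduled through Autosys, as the bullets above describe.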
INTEL CORPORATION, OR Aug’09 to Dec’10
Teradata Developer/DBA
Responsibilities:
• Involved in requirement gathering, business analysis, design and development, testing and implementation of business rules.
• Developed mappings to load data from source systems such as Oracle, SAP BI and SQL Server into the Data Warehouse.
• Wrote MultiLoad, FastLoad and BTEQ scripts for loading data into stage tables and then processing it into BID.
• Dealt with incremental data as well as migration data to load into Teradata.
• Created proper Teradata Primary Indexes (PI), taking into consideration both planned access of data and even distribution of data across all available AMPs; considering both the business requirements and these factors, created appropriate Teradata NUSIs for smooth (fast and easy) access of data (see the sketch after this list).
• Worked on exporting data to flat files using Teradata FastExport.
• Analyzed the data distribution and reviewed the index choices.
• In-depth expertise in the Teradata cost-based query optimizer; identified potential bottlenecks.
• Worked with PPI Teradata tables and was involved in Teradata-specific SQL fine-tuning to increase performance of the overall ETL process.
• Debugged and monitored code using GDB commands.
• Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data.
• Involved in designing the data flow diagram; documented the mappings used in ETL processes.
• Involved in peer reviews.
• Participated in logical and physical data modeling; identified the entity types, their attributes and the relationships between the entities in the organization's business process.
• Extensively used UNIX shell scripting for writing SQL execution scripts in the data loading process.
• Wrote SQL scripts used in the database components of Ab Initio to extract data from different source tables and to load the target tables using Update Table and Output Table components, with the support of config (.cfg) files in graphs.
• Used Ab Initio components such as Reformat, Input File, Output File, Join, Sort, Partition by Key, Normalize, Input Table, Output Table, Update Table, Gather Logs and Run SQL for developing graphs.
• Worked on data enrichment and data standardization to quality-check data coming from various source systems (e.g., Oracle, flat files).
• Sorted the extraction from heterogeneous source systems, such as Oracle and internal and external flat files, and built the transformations, loading formatted data into multi-file and serial files during the intermediate and final stages of the ETL processes using Ab Initio.
• Used the Enterprise Meta Environment (EME) for version control.
• Executed the test scripts and unit testing of the EDW/DM using the Co>Operating system.
• Involved in the creation of an IVR application to support the authentication of existing customers.
• Used Ab Initio GDE to create graphs for the generation of the loan-level file without summaries and to summarize loan-level data records.
• Created UNIX shell scripts (wrapper scripts) to be invoked using Autosys.
• Extensively used the Ab Initio tool's features of component, data and pipeline parallelism.
• Configured the source and target database connections using .dbc files.
• Generated DB configuration files (.dml, .cfg) for source and target tables and modified them according to requirements.
• Used sandbox parameters to check graphs in and out of the repository systems.
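As a rough illustration of the PI/NUSI and data-distribution work referenced above, here is a hedged Teradata SQL sketch; the table and column names (EDW.customer_dim, cust_id, region_cd) are hypothetical.

```sql
/* Hypothetical dimension table: the PI is chosen both for the planned
   access path and for even row distribution across AMPs. */
CREATE TABLE EDW.customer_dim
(
    cust_id    INTEGER NOT NULL,
    cust_name  VARCHAR(100),
    region_cd  CHAR(3),
    open_dt    DATE
)
PRIMARY INDEX (cust_id);

/* NUSI to support frequent lookups by region without full-table scans. */
CREATE INDEX idx_cust_region (region_cd) ON EDW.customer_dim;

/* Quick skew check: row counts per AMP for the chosen PI column. */
SELECT HASHAMP(HASHBUCKET(HASHROW(cust_id))) AS amp_no,
       COUNT(*)                              AS row_cnt
FROM   EDW.customer_dim
GROUP BY 1
ORDER BY 2 DESC;
```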
First Data, NE Oct’08 to July’09
Teradata/ETL Developer
Responsibilities:
• Used Ab Initio as the ETL tool to pull data from source systems, cleanse, transform and load data into databases.
• Involved in creating high-level design and detailed design documents for Ab Initio graphs.
• Extensively involved in Ab Initio graph design, development and performance tuning.
• Worked on the sales Data Mart, moving enterprise data from a queue system using Continuous Flows.
• Involved in unit testing, UAT and system testing and in debugging during the testing phase; used Test Director to keep track of bugs.
• Updated and inserted transactional data according to business changes using Continuous Flows.
• Developed a number of Ab Initio graphs based on business requirements using components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather and Merge.
• Developed complex mappings using multiple sources and targets in different databases and flat files.
• Developed BTEQ scripts for Teradata (a minimal example follows this list).
• Automated workflows and BTEQ scripts using the scheduling tool Cronacle.
• Responsible for tuning the performance of Informatica mappings and Teradata BTEQ scripts.
• Used the Repository Server Administration Console to create and back up repositories.
• Worked with DBAs to tune the performance of the applications and backups.
• Wrote UNIX shell scripts for processing/cleansing incoming text files.
• Used CVS as a versioning tool.
• Performed unit testing and integration testing and generated various test cases.
• Performed data analysis and data validation.
• Migrated data from one system to another with the help of Teradata FastExport, Insert/Select and flat files.
• Performed performance tuning, monitoring and index selection using PMON, Teradata Dashboard, Statistics Wizard, Index Wizard and Teradata Visual Explain to see the flow of SQL queries in the form of icons and make the join plans more effective and faster.
• Extensively used Teradata Manager, Teradata Query Manager and Teradata Administrator to manage systems in production, test and development environments.
• Provided suggestions for the best join plans while visualizing SQL queries with Visual Explain and Explain, recommending the best join indexes such as single-table or multi-table join indexes.
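The following is a minimal sketch of the kind of BTEQ script mentioned above, with a simple error check so a scheduler such as Cronacle can detect failures; the logon placeholder, table names and return codes are illustrative only.

```sql
/* Logon placeholder; credentials normally come from a secured run file. */
.LOGON tdprod/etl_user,xxxxxxxx;

/* Move the day's records from staging into the target table. */
INSERT INTO EDW.sales_fact
SELECT *
FROM   EDW_STG.sales_stg
WHERE  load_dt = CURRENT_DATE;

/* Abort with a non-zero return code so the scheduler flags the failure. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

COLLECT STATISTICS ON EDW.sales_fact COLUMN (load_dt);

.LOGOFF;
.QUIT 0;
```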
Wilmington Finance, PA Jan’07 to Sep’08
Teradata/ETL Developer
Responsibilities:
• Involved in understanding the requirements of the end users/business analysts and developed strategies for ETL processes.
• The project involved extracting data from various sources, then applying transformations before loading the data into the target (warehouse) stage tables and stage files.
• Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter and Update Strategy.
• Used the following Ab Initio components in creating graphs: dataset components (Input File, Output File, Lookup File and Intermediate File), database components (Input Table, Output Table, Run SQL, Truncate Table), transform components (Aggregate, Dedup Sorted, Filter by Expression, Join, Normalize, Reformat, Rollup and Scan), partitioning components (Broadcast, Partition by Expression, Partition by Key, Partition by Round Robin), and the Gather Logs, Redefine Format, Replicate and Run Program components.
• Extensively used the Ab Initio tool's features of component, data and pipeline parallelism.
• Configured the source and target database connections using .dbc files.
• Used the BTEQ and SQL Assistant (QueryMan) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
• Implemented star-schema models for the above data marts; identified the grain for the fact table, identified and tracked the slowly changing dimensions, and determined the hierarchies within the dimensions (a Type 2 dimension sketch follows this list).
• Worked with the DBA team to ensure implementation of the databases for the physical data models intended for the above data marts.
• Created proper Teradata Primary Indexes (PI), taking into consideration both planned access of data and even distribution of data across all available AMPs; considering both the business requirements and these factors, created appropriate Teradata NUSIs for smooth (fast and easy) access of data.
• Adapted the Agile software development methodology to ETL the above data marts.
• Performed performance tuning for Teradata SQL statements using the Teradata Explain command.
• Created .dml files for specifying record formats.
• Extensively worked in the UNIX environment using shell scripts.
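As a hedged illustration of the slowly changing dimension tracking mentioned above, the sketch below shows a Type 2 style close-and-insert; the table, keys and values (EDW.product_dim, product_key, 'P1001') are hypothetical, and a real implementation would use an identity column or a surrogate-key process rather than MAX()+1.

```sql
/* Close out the current version of the changed dimension row. */
UPDATE EDW.product_dim
SET    end_dt      = CURRENT_DATE - 1,
       current_flg = 'N'
WHERE  product_cd  = 'P1001'
AND    current_flg = 'Y';

/* Insert the new version with an open-ended effective date range. */
INSERT INTO EDW.product_dim
    (product_key, product_cd, product_name, price_band,
     start_dt, end_dt, current_flg)
SELECT COALESCE(MAX(product_key), 0) + 1, 'P1001', 'Widget XL', 'HIGH',
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   EDW.product_dim;
```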
HDFC Bank, India Aug’04 to Dec’06
Teradata/ETL Developer
Responsibilities:
• Performed detailed analysis of users' business requirements.
• Tuned Teradata SQL statements using Explain, analyzing the data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated sub-queries, using hash functions, etc. (illustrated in the sketch after this list).
• Loaded flat files into databases using FastLoad and then used them in queries to perform joins.
• Used Teradata SQL with BTEQ scripts to get the data needed.
• Developed new reports for the end users and maintained current ones; all the DSS and legacy systems are stored in Teradata tables.
• Used SQL to query the databases and do as much crunching as possible in Teradata, using very complicated SQL.
• Developed BTEQ/FastLoad/MultiLoad scripts for loading purposes.
• Made extensive use of Teradata utilities (BTEQ, MultiLoad, FastLoad, FastExport).
• Analyzed root causes for failed batch jobs.
• Developed Ab Initio graphs.
• Involved in the preparation of design documents.
• Involved in code reviews and fixed problems found during code review.
• Developed UNIX Korn shell wrappers to run Ab Initio scripts.
• Worked closely with the Ab Initio administrator in implementing the Multi File System (MFS).
• Redesigned existing graphs to bring down processing time.
• Converted user-defined functions of the business process into Ab Initio user-defined functions.
• Developed UNIX shell scripts to generate xfrs, DMLs, config files and SQLs.
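A small, assumed example of the Explain-driven tuning described above: prefix the query with EXPLAIN to inspect the optimizer's plan, then collect statistics on the join and filter columns. The table and column names here are hypothetical.

```sql
/* Inspect the plan: look for product joins, low-confidence estimates
   and unnecessary full-table scans before touching the SQL. */
EXPLAIN
SELECT   c.region_cd, SUM(t.txn_amt)
FROM     EDW.txn_fact     t
JOIN     EDW.customer_dim c
  ON     c.cust_id = t.cust_id
WHERE    t.txn_dt BETWEEN DATE '2006-01-01' AND DATE '2006-03-31'
GROUP BY c.region_cd;

/* Give the optimizer fresh demographics on the join and filter columns. */
COLLECT STATISTICS ON EDW.txn_fact     COLUMN (cust_id);
COLLECT STATISTICS ON EDW.txn_fact     COLUMN (txn_dt);
COLLECT STATISTICS ON EDW.customer_dim COLUMN (cust_id);
```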
Technical Skills:
Teradata Tools: ARCMAIN, BTEQ, Teradata SQL Assistant, Teradata Manager, PMON, Teradata Administrator
ETL Tools: Ab Initio (GDE 3.0/1.15/1.14/1.13, Co>Op 3.0/2.15/2.14/2.13/2.12/2.11), Informatica PowerCenter 6.2/7.1/7.1.3, SSIS
DB Tools: SQL*Plus, SQL*Loader, TOAD 8.0, BTEQ, FastLoad, MultiLoad, FastExport, SQL Assistant, Teradata Administrator, PMON, Teradata Manager
Databases: Teradata 13.0/12.0/V2R6.2/V2R5, Oracle 10g, DB2, MS SQL Server 2000/2005/2008, MS Access
Scheduling Tools: Autosys
Version Control Tools: ClearCase, TFS
Programming Languages: C, C++, Java, J2EE, Visual Basic, SQL, PL/SQL and UNIX Shell Scripting
Data Modelling/Methodologies: Logical/Physical/Dimensional, Star/Snowflake, ETL, OLAP, Complete Software Development Cycle, Erwin 4.0
Operating Systems: Sun Solaris 2.6/2.7/2.8/8.0, Linux, Windows, UNIX
Certifications:
• Teradata Certified Professional
• Teradata Certified SQL Specialist
• Teradata Certified Administrator
• Teradata Certified Application Developer