Saturday, 27 June 2009

Oracle Apex Hosting



For a long time now I have been looking for a web hosting company where I could host my Oracle Apex pages in full and economically, and I have finally found one for £3.50 a month!


Oracle's free http://apex.oracle.com hosting platform is not enough, as it doesn't allow you to make external network calls, i.e. calls to Google services such as Google Maps.

If you want to create an Oracle Apex application which uses Google Maps, or even to use internet-enabled PL/SQL packages such as UTL_HTTP, you have to look elsewhere, as the free Oracle Apex hosting platform doesn't let you do that.
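For example, on a platform that does allow outbound network calls, the simplest UTL_HTTP usage looks something like this (just a sketch; the URL is only an illustration, and on 11g you would also need a network ACL granting access):

```sql
-- fetch the first 2000 bytes of a page with UTL_HTTP
-- (requires outbound network access; on 11g an ACL as well)
select utl_http.request('http://maps.google.com/') from dual;
```

On the free apex.oracle.com platform this call simply fails, which is exactly the limitation described above.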

But don't worry, because the Application Outsource UK LTD platform, for £3.50 a month, gives you 10MB of storage and full Oracle Apex hosting with the ability to make external network calls!

I can't say how pleased I am with their service. Now I am able to do everything I couldn't do with Oracle's free Apex hosting platform. They have a great support website too.

Have a look at the Oracle Apex and Google Maps demo application I have created using their platform: My Oracle Apex and Google Maps samples.

Yahoo, Google API and Oracle Apex

Google maps API and Oracle Apex



More Reading

A good starting document on Integrating Google Maps and Oracle Apex. This document gave me the inspiration to try Google Maps with Oracle Apex.

1. Integrating_Application_Express_with_Google_Maps.pdf.

2. Google API

Monday, 18 May 2009

Update one table from another using ORACLE MERGE INTO

Sometimes, during ETL or when doing data cleansing, I need to update a table with data from another table. I had been looking for scripts to do this easily and came across MERGE INTO, a very easy SQL statement to use and understand.


Assume we have two tables:

create table t1 (
  id   number primary key not null,
  name varchar2(50)
);

create table t2 (
  id  number primary key not null,
  job varchar2(50)
);

insert into t1 values (1, 'Kubilay');
insert into t1 values (2, 'Robin');
insert into t1 values (3, 'Kenny');

select * from t1;


ID                     NAME                                               
---------------------- -------------------------------------------------- 
1                      Kubilay                                            
2                      Robin                                              
3                      Kenny                                              

3 rows selected




insert into t2 values (1, 'DBA');
insert into t2 values (2, 'SA');
insert into t2 values (3, 'Developer');

select * from t2;


ID                     JOB                                                
---------------------- -------------------------------------------------- 
1                      DBA                                                
2                      SA                                                 
3                      Developer                                          

3 rows selected




And now we want to add a new column to t1 and copy the matching data (joined primary key to primary key) from table t2 into table t1.

alter table t1 add job varchar2(60);




Use MERGE INTO to update table t1 with data from table t2.

merge into t1
using t2
on (t1.id = t2.id)
when matched then
  update set t1.job = t2.job;


select * from t1;

ID                     NAME                                               JOB                                                          
---------------------- -------------------------------------------------- ------------------------------------------------------------ 
1                      Kubilay                                            DBA                                                          
2                      Robin                                              SA                                                           
3                      Kenny                                              Developer                                                    

3 rows selected




You can also use the MERGE statement to insert new rows, like an UPSERT, when data in table t2 doesn't have a match in t1. That is, an Oracle UPSERT statement!
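A sketch of that upsert form, using only the t1 and t2 tables above (rows in t2 with no matching id in t1 would be inserted with a NULL name):

```sql
-- update matching rows, insert the rest: an Oracle UPSERT
merge into t1
using t2
on (t1.id = t2.id)
when matched then
  update set t1.job = t2.job
when not matched then
  insert (id, job)
  values (t2.id, t2.job);
```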

Thursday, 9 April 2009

Upgrading to 11g, a technical workshop seminar by Mike Dietrich

On the 7th of April I attended the Oracle 11g upgrade technical workshop seminar given by Mike Dietrich at Oracle's London City offices.

This was one of the best Oracle Events I have attended.

Mike ran us through his 400+ slides of tips & tricks and best practices for upgrading to Oracle 11g. He also presented three real-life cases in which he helped three big Oracle customers during their upgrades to 11g. The content of the slides is very good and Mike's presentation skills were excellent. The slides are quite big, so you have to download them in three parts (about 8MB in total). Here they are: Oracle 11g Upgrade Workshop presentation


Workshop presentation highlights


  • Better support for the optimizer during the upgrade.
  • Definitely apply the timezone patch; you can't upgrade without it.
  • Recalculate statistics before the upgrade.
  • If you stay on the same host, use DBUA.
  • If you move hosts, use a command-line upgrade.
  • Plan for performance degradation: test, test and test again.
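For the statistics tip, something along these lines before starting the upgrade (a sketch only; the exact calls and parameters depend on your version and environment):

```sql
-- recalculate optimizer statistics before the upgrade
begin
  dbms_stats.gather_dictionary_stats;
  dbms_stats.gather_database_stats(
    estimate_percent => dbms_stats.auto_sample_size);
end;
/
```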


It seems that when it comes to Oracle upgrades, people will always wait for R2. Mike said that Oracle is well aware of this, and that with 11g they tried to break this taboo by explaining that R2 releases don't actually introduce anything special beyond bug fixes. People will see this for themselves when they see how many small bugs get fixed in R2.

From the presentation I also got the impression that upgrading to 11g is easy and takes little time. But the performance implications the upgrade might introduce are hard to control if you haven't purchased the Diagnostics and Tuning Pack licences (£1,777 each per CPU). The Diagnostics Pack is designed to automatically detect and sort out SQL statements that regress. Without it, your poor DBA might end up dealing for days with hundreds of "regressed" SQL statements caused by the upgrade, whereas the pack deals with them in hours. (Hopefully not!)

Thursday, 2 April 2009

"Informatica on Demand" an ETL tool running in the Cloud

Recently I was looking for ways to integrate (ETL) data from separate sources into the Cloud: moving data between MS SQL Server and Salesforce, or from MS SQL Server to Oracle, for example. Mostly I was interested in an easy way to migrate data from back-end database systems such as MS SQL Server into the Salesforce Cloud.

How is that possible? The first things that come to mind are APIs, Java, import & export, and custom code, code and more code.

Not really: all you need is Informatica on Demand! A simple, intro-level and absolutely free ETL tool running in the Cloud!

Informatica is a very reputable company in the data integration field, and Informatica on Demand is part of their free Software as a Service (SaaS) offering: an ETL tool which is itself in the Cloud. You don't have to install any software in your infrastructure, apart from a small Informatica agent that identifies your network to the Informatica Cloud. All you do is create a login on their website and start working! How cool is that! You log in to the Informatica on Demand website and configure, and even schedule, your data migration and ETL tasks between your databases and the Cloud as easily as checking your email!

This free tool does schema-to-schema migrations only. It doesn't allow you to write custom SQL in a source window and load its result set into the target; it is limited to table-to-table migrations between separate sources and targets. You can, however, save your SQL as a view in your source database, and the tool will see the view as a table object.
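For example, if your source is an Oracle database, wrapping the custom SQL in a view makes it visible to the tool as a table-like object (the view, table and column names here are made up purely for illustration):

```sql
-- expose a custom query to the tool as a table object
create or replace view v_active_contacts as
  select id, name, email
  from   contacts
  where  status = 'ACTIVE';
```

The tool then migrates v_active_contacts like any other table.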

In the same Informatica on Demand website, and within the same Cloud infrastructure, they also have a suite of more advanced tools such as Data Synchronization, Data Replication and Data Assessment. This stack of tools is not free, but a 30-day trial is available.

Overall, I found the tool quite easy to use, and with a few clicks I could transfer thousands of rows from, say, Oracle to Salesforce, or from Microsoft SQL Server to Oracle. No need to install and configure gateways or ODBC, and no need to tweak any hardware locally. It also has a scheduler that works and sends emails when tasks are complete, and all of this is hosted and running in the Cloud, so you don't have to back anything up or look after things.

Here is a snapshot of the Field Mapping screen:

Tuesday, 17 February 2009

ER modeling Tools now on the Web!

I was looking at the Google Apps Gallery and came across an ER modeling tool which is very cool, so I decided to blog about it.


The tool is called GAE SQL Designer and was created by Jason W. Miller. It is an ER modeling tool on the web! All you need is a browser and you can start modeling databases, with no installs and no extra software. You can access it directly here: http://gaesql.appspot.com/

Excellent, I liked the idea very much! One thing that is missing is the ability to generate SQL from your data models. There is another, less intuitive Google app for that here: http://code.google.com/p/wwwsqldesigner.

Apart from that, I think Jason's GAE SQL Designer will be useful for quick prototyping and learning. Well done Jason!