Friday, 10 February 2017

Big Data and Analytics Collaboration

I suggest that companies need to make every effort to bring IT and business users together to collaborate on best practice for their deployment. “It isn’t always easy, but it’s critical. Equally, while traditional systems are better suited to a ‘waterfall’ approach where technology is introduced all at once, that’s not so true of big data projects. Through the collaboration between IT and the business, you need to focus on the goals and desired outcomes and introduce the technology in phases to reflect those.”


For the full article, in which experts share advice: http://hortonworks.com/blog/hortonworks-customers-share-advise-2017/

Image from: https://www.castlellc.com/collaboration.aspx

Fast Access Fact Tables - Oracle

Indexes, index organized tables (IOT) & in-memory tables

Introduction

There always seems to be more than one way to complete a task in the world of databases and programming.  So if you have a large table requiring rapid access, how do you perform a quick test to get some direction?
In this post, I do just that.  I wanted to compare several options to test my theory, and at least have something to propose as a solution to take forward into more rigorous performance testing.
In this article, I will compare six methods of holding a customer table:
  1. Heap table with a primary key index on Customer ID.
  2. Partitioned Heap table with a primary key index on Customer ID.
  3. Index Organized table with a primary key index on Customer ID.
  4. Partitioned Index Organized table with a primary key index on Customer ID.
  5. In-Memory Heap table with a primary key index on Customer ID.
  6. In-Memory Partitioned Heap table with a primary key index on Customer ID.
The In-Memory tests will only work in Oracle 12.1 and above, with the In-Memory option enabled.
The usage will entail a query against the Customer table on Customer ID where the full row is returned for analysis.  The query will look at one Customer ID at a time.
At this point, take a guess at which table will come out on top.

Method

Using a Virtual Machine, downloaded from the OTN Network and running in VirtualBox, I will:
  1. In turn create six tables of the above structures.
  2. Insert 2 million rows to give enough volume for the results to be meaningful (although, really, this is still quite small).
  3. Run 800 queries against each table, each time selecting a Customer ID at random (dbms_random.value).
  4. Compare the Total, Average, Minimum and Maximum query times - excluding as much of the time as possible taken to record results etc.
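The shape of the harness described above can be sketched in Python (an illustrative stand-in for the PL/SQL procedures shown later; `run_query` is a hypothetical stub for the indexed lookup against Oracle):

```python
import random
import time

def run_query(customer_id):
    """Stand-in for the real lookup; in the actual test this is a
    SELECT ... WHERE customer_id = :id against the Oracle table."""
    pass  # the real work happens in the database

def run_tests(num_queries, max_customer_id):
    """Pick a random Customer ID each iteration, time only the query
    itself, and aggregate Total/Average/Min/Max as in steps 3 and 4."""
    timings = []
    for _ in range(num_queries):
        customer_id = random.randint(1, max_customer_id)
        start = time.perf_counter()
        run_query(customer_id)  # only the query call is timed
        timings.append(time.perf_counter() - start)
    return {
        "total": sum(timings),
        "average": sum(timings) / len(timings),
        "minimum": min(timings),
        "maximum": max(timings),
    }
```

The key detail, mirrored in the PL/SQL below, is that the clock is read immediately before and after the query so that bookkeeping time is excluded.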

The Tables

The statement below creates the table for Test 1 - Heap Table with a Primary Key Index
create table customer_profile
(
  customer_id       number not null,
  gender            varchar2(1),
  age               number,
  segment_id        number,
  profile_1         number,
  profile_2         number,
  profile_3         number,
  profile_4         number,
  profile_5         number,
  profile_6         number,
  constraint cutomer_profile_pk primary key (customer_id)
)
For Test 2, add the following syntax:
partition by hash (customer_id)
partitions 4
For Test 3, add the following to Test 1:
organization index
For Test 4, add the following to Test 1:
organization index
partition by hash (customer_id)
partitions 4
For Test 5, add the following to Test 1:
inmemory
For Test 6, add the following to Test 2:
inmemory

TEST DATA

If you have access to some good test data, use it.  Sadly I don't, so I need to manufacture some.  The full SQL is attached, but here is an extract, with a couple of sample rows as an example:
create or replace procedure customer_profile_insert(number_of_rows in number, number_of_repeats in number default 1)
…
    insert /*+ append */ into customer_profile
      (customer_id, gender, age, segment_id, profile_1, profile_2, profile_3, profile_4, profile_5, profile_6)

    with v1 as (select rownum n from dual connect by level <= customer_profile_insert.number_of_rows) -- create n rows

    select (customer_profile_insert.number_of_rows * (indx-1)) + n customer_id, 
           case when round(sys.dbms_random.value(0,1)) = 0 then 'F' else 'M' end gender,
           round(sys.dbms_random.value(16,80)) age,
           round(sys.dbms_random.value(1,5)) segment_id,
           round(sys.dbms_random.value(1,25)) profile_1,
           round(sys.dbms_random.value(1,25)) profile_2,
           round(sys.dbms_random.value(1,25)) profile_3,
           round(sys.dbms_random.value(1,25)) profile_4,
           round(sys.dbms_random.value(1,25)) profile_5,
           round(sys.dbms_random.value(1,25)) profile_6
      from v1
      order by profile_6; -- randomise the customer_id order
…
CUSTOMER_ID G        AGE SEGMENT_ID  PROFILE_1  PROFILE_2  PROFILE_3  PROFILE_4  PROFILE_5  PROFILE_6
----------- - ---------- ---------- ---------- ---------- ---------- ---------- ---------- ----------
       1762 F         70          4         23         19         21          4         22          6
       1769 F         18          5          1         22          8          1         13         14
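The row-generation logic can also be expressed in Python for readers less familiar with `dbms_random` (a sketch; the column ranges match the `dbms_random.value` calls above, and the function name is illustrative):

```python
import random

def make_customer_row(customer_id):
    # Mirrors the ranges used in customer_profile_insert:
    # gender 'F'/'M', age 16-80, segment 1-5, six profiles each 1-25.
    return {
        "customer_id": customer_id,
        "gender": random.choice("FM"),
        "age": random.randint(16, 80),
        "segment_id": random.randint(1, 5),
        **{f"profile_{i}": random.randint(1, 25) for i in range(1, 7)},
    }
```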

RUNNING THE TESTS

I created two procedures.  The first is the entry point to the test (customer_profile_select_group), and loops round 20 times calling a second procedure that queries the customer table (customer_profile_select).  My apologies if the formatting isn't good; it seems to lose formatting when pasting into Wordpress!
create or replace procedure customer_profile_select_group(number_of_iteractions in number, test_description in varchar2) 
as
  test_group number := test_group_seq.nextval;
begin
  for indx in 1 .. 20 loop
    customer_profile_select(customer_profile_select_group.number_of_iteractions,
                            customer_profile_select_group.test_group, 
                            customer_profile_select_group.test_description);
  end loop;
  exception
    when others then
      dbms_output.put_line(dbms_utility.format_error_stack());
      dbms_output.put_line(dbms_utility.format_error_backtrace());
      dbms_output.put_line(dbms_utility.format_call_stack());
      raise;
end customer_profile_select_group;
/
create or replace procedure customer_profile_select(number_of_iteractions in number, test_group in number, test_description in varchar2) 
as
  customer_profile_record customer_profile%rowtype;
  max_customer_id customer_profile.customer_id%type;
  lookup_customer_id customer_profile.customer_id%type;
  test_seq_number test_timings.test_id%type := test_timings_seq.nextval;
  start_time test_timings.start_time%type;
  end_time test_timings.end_time%type;
begin

  -- get the highest customer_id so we can set the upper limit in the query.
  -- we know the lowest value is 1.
  select max(customer_id)
    into customer_profile_select.max_customer_id
    from customer_profile;

  for indx in 1 .. customer_profile_select.number_of_iteractions loop

    customer_profile_select.lookup_customer_id := round(sys.dbms_random.value(1, customer_profile_select.max_customer_id));

    start_time := systimestamp;

    select *
      into customer_profile_record
      from customer_profile
     where customer_id = customer_profile_select.lookup_customer_id;

    end_time := systimestamp;

    insert into test_timings
      (test_id, test_description, test_group_id, test_iteration, start_time, end_time)
    values
      (customer_profile_select.test_seq_number, customer_profile_select.test_description, 
       customer_profile_select.test_group, indx, customer_profile_select.start_time, 
       customer_profile_select.end_time);

  end loop; 

  commit;

  exception
    when others then
      dbms_output.put_line(dbms_utility.format_error_stack());
      dbms_output.put_line(dbms_utility.format_error_backtrace());
      dbms_output.put_line(dbms_utility.format_call_stack());
      raise;
end customer_profile_select;
/
We will call these procedures six times in total, and the results will be stored in a table called test_timings.  Note that I record the start and end times immediately before and after the query against the customer table, to avoid including time spent on other processing that is of no interest for this test.

THE RESULTS

Well, I wasn't totally surprised by the results, but it's always great when your theory and belief are backed up with some facts (albeit test results rather than real life).

It is clear from the results that the third scenario, the IOT with a Primary Key index, far outperforms the others.  I have added a blue line at the average time for this scenario, and it's clear that this average is faster than the fastest time in every other scenario except the Partitioned IOT, whose average time is only slightly slower anyway.
In fact, the IOT (3) is approximately 63% faster than a normal Heap Table with an Index.
Please note - these tests may not hold true in your environment, so further testing is required.

Test limitation

Let's not get too excited; there are some limitations to the test:
  1. No concurrent queries – dedicated processing.
  2. Small set of test data.
  3. Manufactured test data.
  4. No gaps in Customer ID.
  5. Data forms a normal distribution.
  6. No parallel processing tests.

Hardware and database

Host Machine:
  • OS: OS X (El Capitan), version 10.11.3
  • Processor: 2.8 GHz Intel i7 Quad Core
  • RAM: 16 GB
  • Disk: 1 TB Flash
Virtual Machine:
  • Oracle VirtualBox v5.0.2
  • OS: Oracle Linux (64-bit)
  • Processor: 2 CPUs
  • RAM: 6 GB
  • Disk Allocation: 200 GB
Oracle Sample App V506 (OTN Network)
Database:
  • Oracle 12c v12.1.0.2 Enterprise Edition
  • Single Threaded

Full Test Script Below...


/*
    Customer Lookup Table Performance Tests
    
    These tests compare six table formats and the performance of looking up a 
    single ID at a time from the tables.
    
    Feel free to use, reuse and share this test script; there is nothing sensitive.
    
    Author: Zog Gibbens
    Date:   January 2016
    Database Testing: Oracle 12c, version 12.1.0.2

*/
set linesize 1000;
-- create a table to hold the test timings
drop table test_timings purge
/
create table test_timings
(
  test_group_id     number not null,
  test_description  varchar2(50) not null,
  test_id           number not null,
  test_iteration    number not null,
  start_time        timestamp not null,
  end_time          timestamp not null,
  constraint test_timings_pk primary key (test_id, test_group_id, test_iteration)
)
/
drop sequence test_timings_seq
/
create sequence test_timings_seq start with 1 increment by 1 nocycle
/
drop sequence test_group_seq
/
create sequence test_group_seq start with 1 increment by 1 nocycle
/
-- create the customer profile: standard heap table with primary key index.
create table customer_profile
(
  customer_id       number not null,
  gender            varchar2(1),
  age               number,
  segment_id        number,
  profile_1         number,
  profile_2         number,
  profile_3         number,
  profile_4         number,
  profile_5         number,
  profile_6         number,
  constraint cutomer_profile_pk primary key (customer_id)
)
/
create or replace procedure customer_profile_insert(number_of_rows in number, number_of_repeats in number default 1)
as
  num_rows number := 0;
begin
  execute immediate 'truncate table customer_profile';
  
  for indx in 1 .. customer_profile_insert.number_of_repeats loop
    
    insert /*+ append */ into customer_profile
      (customer_id, gender, age, segment_id, profile_1, profile_2, profile_3, profile_4, profile_5, profile_6)
    with v1 as (select rownum n from dual connect by level <= customer_profile_insert.number_of_rows) -- create number_of_rows rows
    select (customer_profile_insert.number_of_rows * (indx-1)) + n customer_id, 
           case when round(sys.dbms_random.value(0,1)) = 0 then 'F' else 'M' end gender,
           round(sys.dbms_random.value(16,80)) age,
           round(sys.dbms_random.value(1,5)) segment_id,
           round(sys.dbms_random.value(1,25)) profile_1,
           round(sys.dbms_random.value(1,25)) profile_2,
           round(sys.dbms_random.value(1,25)) profile_3,
           round(sys.dbms_random.value(1,25)) profile_4,
           round(sys.dbms_random.value(1,25)) profile_5,
           round(sys.dbms_random.value(1,25)) profile_6
      from v1
      order by profile_6; -- randomise the customer_id order
    commit;
    
  end loop;
  
  dbms_stats.gather_table_Stats(ownname => USER, tabname => 'CUSTOMER_PROFILE', estimate_percent => 5);
  
  select count(*)
    into num_rows
    from customer_profile;

  dbms_output.put_line(num_rows || ' rows inserted into customer_profile.');
  exception
    when others then
      dbms_output.put_line(dbms_utility.format_error_stack());
      dbms_output.put_line(dbms_utility.format_error_backtrace());
      dbms_output.put_line(dbms_utility.format_call_stack());
      raise;
end customer_profile_insert;
/
create or replace procedure customer_profile_select(number_of_iteractions in number, test_group in number, test_description in varchar2) 
as
  customer_profile_record customer_profile%rowtype;
  max_customer_id customer_profile.customer_id%type;
  lookup_customer_id customer_profile.customer_id%type;
  test_seq_number test_timings.test_id%type := test_timings_seq.nextval;
  start_time      test_timings.start_time%type;
  end_time        test_timings.end_time%type;
begin

  -- get the highest customer_id so we can set the upper limit in the query.
  -- we know the lowest value is 1.
  select max(customer_id)
    into customer_profile_select.max_customer_id
    from customer_profile;
       
  for indx in 1 .. customer_profile_select.number_of_iteractions loop
    
    customer_profile_select.lookup_customer_id := round(sys.dbms_random.value(1, customer_profile_select.max_customer_id));
    
    start_time := systimestamp;
    
    select *
      into customer_profile_record
      from customer_profile
      where customer_id = customer_profile_select.lookup_customer_id;
    
    end_time := systimestamp;
    
    insert into test_timings
      (test_id, test_description, 
       test_group_id, test_iteration, start_time, 
       end_time)
    values
      (customer_profile_select.test_seq_number, customer_profile_select.test_description, 
       customer_profile_select.test_group, indx, customer_profile_select.start_time, 
       customer_profile_select.end_time);
  
  end loop; 

  commit;
  
  exception
    when others then
      dbms_output.put_line(dbms_utility.format_error_stack());
      dbms_output.put_line(dbms_utility.format_error_backtrace());
      dbms_output.put_line(dbms_utility.format_call_stack());
      raise;
end customer_profile_select;
/
create or replace procedure customer_profile_select_group(number_of_iteractions in number, test_description in varchar2) 
as
  test_group number := test_group_seq.nextval;
begin
  for indx in 1 .. 20 loop
    customer_profile_select(customer_profile_select_group.number_of_iteractions,
                            customer_profile_select_group.test_group, 
                            customer_profile_select_group.test_description);
  end loop;
  exception
    when others then
      dbms_output.put_line(dbms_utility.format_error_stack());
      dbms_output.put_line(dbms_utility.format_error_backtrace());
      dbms_output.put_line(dbms_utility.format_call_stack());
      raise;
end customer_profile_select_group;
/

/*
-- Reset Everything
*/
set serveroutput on;
truncate table test_timings;
drop sequence test_timings_seq;
create sequence test_timings_seq start with 1 increment by 1 nocycle;
drop sequence test_group_seq;
create sequence test_group_seq start with 1 increment by 1 nocycle;
/*
-- Test 1, standard heap table with standard index
*/
drop table customer_profile purge;
create table customer_profile
(
  customer_id       number not null,
  gender            varchar2(1),
  age               number,
  segment_id        number,
  profile_1         number,
  profile_2         number,
  profile_3         number,
  profile_4         number,
  profile_5         number,
  profile_6         number,
  constraint cutomer_profile_pk primary key (customer_id)
)
/
alter procedure customer_profile_insert compile;
alter procedure customer_profile_select compile;
alter procedure customer_profile_select_group compile;
exec customer_profile_insert(1000000,2)
/
exec customer_profile_select_group(200,'1. Heap Table+Index')
/
/*
-- Test 2, standard heap table with partitions by hash
*/
drop table customer_profile purge;
create table customer_profile
(
  customer_id       number not null,
  gender            varchar2(1),
  age               number,
  segment_id        number,
  profile_1         number,
  profile_2         number,
  profile_3         number,
  profile_4         number,
  profile_5         number,
  profile_6         number,
  constraint cutomer_profile_pk primary key (customer_id)
)
partition by hash (customer_id)
partitions 4
/
alter procedure customer_profile_insert compile;
alter procedure customer_profile_select compile;
alter procedure customer_profile_select_group compile;
set serveroutput on;
exec customer_profile_insert(1000000,2)
/
exec customer_profile_select_group(200,'2. Heap Table+Index+Partition')
/
/*
-- Test 3, index organised table
*/
drop table customer_profile purge;
create table customer_profile
(
  customer_id       number not null,
  gender            varchar2(1),
  age               number,
  segment_id        number,
  profile_1         number,
  profile_2         number,
  profile_3         number,
  profile_4         number,
  profile_5         number,
  profile_6         number,
  constraint cutomer_profile_pk primary key (customer_id)
)
organization index
/
alter procedure customer_profile_insert compile;
alter procedure customer_profile_select compile;
alter procedure customer_profile_select_group compile;
set serveroutput on;
exec customer_profile_insert(1000000,2)
/
exec customer_profile_select_group(200,'3. IOT Table+Index')
/
/*
-- Test 4, index organised table hash partitioned
*/
drop table customer_profile purge;
create table customer_profile
(
  customer_id       number not null,
  gender            varchar2(1),
  age               number,
  segment_id        number,
  profile_1         number,
  profile_2         number,
  profile_3         number,
  profile_4         number,
  profile_5         number,
  profile_6         number,
  constraint cutomer_profile_pk primary key (customer_id)
)
organization index
partition by hash (customer_id)
partitions 4
/
alter procedure customer_profile_insert compile;
alter procedure customer_profile_select compile;
alter procedure customer_profile_select_group compile;
set serveroutput on;
exec customer_profile_insert(1000000,2)
/
exec customer_profile_select_group(200,'4. IOT Table+Index+Partition')
/
/*
-- Test 5, standard heap table with standard index - InMemory
*/
drop table customer_profile purge;
create table customer_profile
(
  customer_id       number not null,
  gender            varchar2(1),
  age               number,
  segment_id        number,
  profile_1         number,
  profile_2         number,
  profile_3         number,
  profile_4         number,
  profile_5         number,
  profile_6         number,
  constraint cutomer_profile_pk primary key (customer_id)
)
inmemory
/
alter procedure customer_profile_insert compile;
alter procedure customer_profile_select compile;
alter procedure customer_profile_select_group compile;
exec customer_profile_insert(1000000,2)
/
exec customer_profile_select_group(200,'5. InMemory+Heap Table+Index')
/
/*
-- Test 6, standard heap table with partitions by hash - InMemory
*/
drop table customer_profile purge;
create table customer_profile
(
  customer_id       number not null,
  gender            varchar2(1),
  age               number,
  segment_id        number,
  profile_1         number,
  profile_2         number,
  profile_3         number,
  profile_4         number,
  profile_5         number,
  profile_6         number,
  constraint cutomer_profile_pk primary key (customer_id)
)
partition by hash (customer_id)
partitions 4
inmemory
/
alter procedure customer_profile_insert compile;
alter procedure customer_profile_select compile;
alter procedure customer_profile_select_group compile;
set serveroutput on;
exec customer_profile_insert(1000000,2)
/
exec customer_profile_select_group(200,'6. InMemory+Heap Table+Index+Partition')
/
/*
------------------------------------------------------------------------------------------
*/
select test_group_id, test_description, 
       sum(elapsed_seconds) sum_secs, avg(elapsed_seconds) avg_secs, 
       max(elapsed_seconds) max_secs, min(elapsed_seconds) min_secs
  from (
        select test_group_id, test_description, test_id, sum(extract(second from end_time-start_time)) elapsed_seconds 
          from test_timings
         group by test_group_id, test_id, test_description
       )
 group by test_group_id, test_description
 order by test_group_id, test_description;
select count(*) num_tests from test_timings
/


Restricting Number of Rows Returned in a Table Analysis (OBIEE)

When I recently reviewed a Dashboard showing a "Top 10", was it just luck that on my first visit the page actually showed a Top 10?
Surely it can't be difficult to get just the Top 10?  Depending on the filters set, sometimes it was a Top 10, sometimes a Top 7, sometimes a Top 13.  So was it just luck that I got the Top 10 first time round?  I looked into the analysis: the RANK function had been used to rank the results in ascending order of the measure column, and this looked fine; rows were ranked 1, 2, 3...  so why did I get too many rows?
The RANK function will give a sequential number as values increase, e.g.
RANK   VALUE
----   -----
1      10
2      11
3      15
3      15
... however, once a value is repeated, every row in the tied set gets the lowest rank for that set.  Hence, in this example, if the filter were set to RANK(VALUE) <= 3, four rows would be returned.  What are the alternatives?  RSUM or RCOUNT (running aggregates)?  Working through an example, I first created an analysis with Product Description, Revenue, Rank, RSUM and RCOUNT.
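The tie behaviour can be reproduced outside OBIEE; here is a small Python simulation of RANK against a plain sequential row count (a sketch of the idea, not the BI Server's implementation):

```python
def rank(values):
    """Standard RANK over an ascending list: ties share the lowest
    position in the tied set, so a filter rank <= N can return > N rows."""
    return [1 + sum(1 for w in values if w < v) for v in values]

def row_count(values):
    """A plain sequential counter, which is what a strict Top N needs."""
    return list(range(1, len(values) + 1))
```

For the values from the table above, `rank([10, 11, 15, 15])` gives `[1, 2, 3, 3]`, so a filter of rank <= 3 returns four rows, while `row_count` gives `[1, 2, 3, 4]` and a filter of <= 3 returns exactly three.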

[Edit - not shown here, but TOPN behaves in a similar way to RSUM and RCOUNT]
Rank Formula:


To manufacture tied results, I have used the CAST function to convert the values to integers, thereby creating duplicates.
RSUM Formula:

This in theory would create a sequential number.
RCOUNT Formula:

Again, I would expect this to result in a sequential number.
The results:

In the results, at Rank 10 there are three products with the same revenue (when rounded to zero decimal places).  As a result, rows 10, 11 and 12 all get a Rank of 10.  Surprisingly, they also all get the same RSUM value of 10.  RCOUNT continues sequentially.  Note that this is a simplistic data set; for the data that prompted me to write this post, I wasn't seeing sequential values for RCOUNT either.  If you examine the SQL issued to the database (on Oracle 11.2.0.4 at least), no SQL is issued to derive the RSUM and RCOUNT columns; they are computed on the BI Server, and not necessarily in the order of the data returned by the database.
How do we solve this?
If you simply want the result in a table, it's simple: change the number of rows displayed by editing the table properties.
The default is usually to show the Paging Controls below the results:


Switch this to "Hide" and change the number of rows to display:


The Results: 


So we now get 10 rows returned.
The Limitations:
So, yes, this works for a table, but not for a chart based on the same results:

As you can see in the chart, the full list of products appears.
Pivot Table:
The solution, however, is relatively simple: create a Pivot Table and set its page size to the desired number of rows as shown above; then, within the Pivot Table's edit view, create a chart on the Pivot Table:


Problem Solved.

OBIEE URLs Explained

OBIEE Dashboards obviously provide a graphical interface for navigating data, offering drill-down through tables and charts, and action links to direct the user to another dashboard, application or website.
Sometimes you need to use a dashboard URL to guide the user to another dashboard page, and once you understand how it is constructed it's quite simple to use in your analyses or action links.
External URL
http://[server name]:[port]/analytics/saw.dll?Dashboard
The default port is 9704; however, your administrator may have set an alternative.
From the Web Server
/analytics/saw.dll?Dashboard
From the Presentation Services
saw.dll?Dashboard
Very simple so far, but this only gets us to the home page; let's look at navigating to a specific Dashboard Page.  Navigating to a page uses parameters in the URL, where each parameter is prefixed with an &.
Portal Path - Catalogue path to the dashboard
&PortalPath=dashboard_path
Page - Name of the Dashboard Page
&Page=page_name
Note: Any spaces in the dashboard or page name should be replaced with %20; for example, "My Page" becomes "My%20Page".
You may also want to set the filters or prompts on the target page; you add more parameters to enable this.
Filters
&Action=Navigate
&P0=n - Where n is the number of filters to be applied.
&P1=operator - Where operator is eq, lt etc. see the list below.
&P2=column name - The column to be filtered
&P3=filter value - The filter value for the column in P2
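Putting the pieces together, a dashboard URL with one filter could be assembled like this (a Python sketch following the parameter scheme described above; the server, paths and values are placeholders, and `quote()` handles the %20 encoding of spaces):

```python
from urllib.parse import quote

def dashboard_url(server, port, portal_path, page,
                  op=None, column=None, value=None):
    """Build an OBIEE dashboard URL with an optional single filter,
    using the &PortalPath, &Page, &Action and &P0..&P3 parameters."""
    url = (f"http://{server}:{port}/analytics/saw.dll?Dashboard"
           f"&PortalPath={quote(portal_path)}&Page={quote(page)}")
    if op is not None:
        url += (f"&Action=Navigate&P0=1&P1={op}"
                f"&P2={quote(column)}&P3={quote(str(value))}")
    return url
```

For example, `dashboard_url("bihost", 9704, "/shared/Sales/_portal/Sales", "My Page", "eq", "Region", "North East")` yields a URL in which "My Page" appears as `My%20Page` and the Region filter is carried in P1..P3.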

Explain plans explained - Oracle

When you submit a SQL query to Oracle, the optimizer first works out the most efficient way of running it. The optimizer looks at the query and at statistics on the database objects, then works out many ways in which the query could be run, i.e. in what order to access the objects. For each possible method a score is estimated (the optimizer cost), and the path, or plan, with the lowest score/cost is chosen.


To see the chosen method of accessing the objects, we can view the query plan, known as the Explain Plan. This is very useful for understanding why a query is perhaps taking longer than expected, and can be used to trap sub-optimal joins, such as an accidental Cartesian join.


Here is a really good link offering an introduction to Explain Plans.

Using Action Links in OBIEE 11g

Oracle's OBIEE 11g now has the option to create an action link from an analysis, dashboard, KPI, etc., which basically means you can add a link to direct the user elsewhere: within the OBI catalogue, or to an external system or web page.  The beauty is in the magic: you can set an action link against a column value, and wherever that value is shown, in a table or a chart, the table or chart will trigger the action link.
As well as directing a user to another dashboard page, you can use action links to set values in Dashboard Prompts, improving the user's experience as the prompts have effectively been set for them.  This blog post demonstrates how these links can be set up.
Step 1
Build your criteria, and when you are happy with the results, you are ready to start adding an Action Link.

Step 2
From the criteria, select the drop down on the column, in this example "Total Sales Income".  Select "Column Properties".

Step 3
Navigate to the "Interaction" tab; you will notice you can add interactions to both the column heading and the value.  In this example we concentrate on the value; however, the heading works in just the same way.
Click the + sign to add the action link.

Step 4
Enter some text in the "Link Text" box; this will pop up whenever the user clicks on the column value, whether the presentation is a table or a chart.

Step 5
Choose the type of action link.  In this example we will choose "Navigate to a Web Page", as we are going to navigate to a page within our own dashboards.

Step 6
Paste the URL into the "URL" box, including parameter settings if you have any, i.e. values for prompts on the target page.  If you do have parameters to set, click the "Define Parameters" button.


Step 7
If your URL has parameter values set, you will notice that OBIEE has broken the parameters up for you, so you can tweak them as you see fit.  In this example, we will set the parameter for the "Region" dimension to be driven by a column value selected on the source page, i.e. the page hosting the analysis with the action link.

Step 8
Select the button next to the dimension to be changed.

Step 9
Choose "Column Value"

Step 10
Select the column that contains the value you want to use.

Step 11
We are now set up; however, the default options don't hide the action link parameters, so when the user clicks it a box will appear with the value settings.  This is fine for testing but, in my opinion, not a great user experience in these circumstances.
I set all of the parameters to hidden, thereby avoiding the confirmation box appearing when the link is used.

There will be another blog post on how to set the URL and the parameter values.
I hope this has helped you.