ODTUG Blogs Sat, 19 Aug 2017 16:26:26 +0000 OAC - Import Essbase Cloud to OAC-BI Admin Tool http://beyond-just-data.blogspot.com/2017/08/oac-import-essbase-cloud-to-oac-bi.html <div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">I finally have time to play with Oracle Analytics Cloud Enterprise Edition (OAC EE). &nbsp;</span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">I have been interested in setting up my OAC BI instance to work off an Essbase Cloud cube.</span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><br /></span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><br /></span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">So, first things first: download the latest version of the OBIEE Admin Tool that supports 
BI Cloud</span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">http://www.oracle.com/technetwork/middleware/bicloud/downloads/index.html</span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">At the time of this post, the latest version was 12.2.2.0.20.</span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">Keep an eye out for updates...</span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , 
&quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span></div><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://4.bp.blogspot.com/-DLM8dNQuBUA/WZUeFQsAvdI/AAAAAAAAK9k/4P4rHQ7Dmv0Y04W3xB-EECx9VckWCLn_ACLcBGAs/s1600/00.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="461" data-original-width="1568" height="117" src="https://4.bp.blogspot.com/-DLM8dNQuBUA/WZUeFQsAvdI/AAAAAAAAK9k/4P4rHQ7Dmv0Y04W3xB-EECx9VckWCLn_ACLcBGAs/s400/00.png" width="400" /></a></span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"></span></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">I have an Oracle Analytics Cloud Enterprise Edition instance that allows me to work with an RPD for Data Modeling instead of the built-in data modeler that we know from BICS. 
&nbsp;The BICS Data Modeler does not let me connect to Essbase Cloud Service.&nbsp; I also have an </span></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">Oracle Analytics Cloud Essbase Cloud Service.&nbsp; I need to make my Essbase Cloud cubes available as subject areas in my OAC-BI instance.</span></span></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">The cube I will start with is Sample.Basic from my Essbase Cloud server.</span></span></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://2.bp.blogspot.com/-Y7i_w25XJes/WZcQ_NFoOTI/AAAAAAAAK-Q/K4bUBxo7nYot9o_4v8S73SMJoVW2XX6JwCLcBGAs/s1600/2017-08-18_10-58-03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="375" data-original-width="1002" height="148" src="https://2.bp.blogspot.com/-Y7i_w25XJes/WZcQ_NFoOTI/AAAAAAAAK-Q/K4bUBxo7nYot9o_4v8S73SMJoVW2XX6JwCLcBGAs/s400/2017-08-18_10-58-03.png" width="400" /></a></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">So, starting up the Admin Tool, I need to create a new RPD. 
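Essbase qualifies a cube name as Application.Database, which is why the cube above is written Sample.Basic. As an illustrative aside (my own sketch, not something from the original post), splitting such a qualified name looks like this in Python:

```python
def split_cube_name(qualified_name):
    """Split an Essbase 'Application.Database' name (e.g. 'Sample.Basic')
    into its application and database (cube) parts."""
    app, sep, cube = qualified_name.partition(".")
    if not sep or not app or not cube:
        raise ValueError("expected '<application>.<database>', got %r" % qualified_name)
    return app, cube

print(split_cube_name("Sample.Basic"))  # prints: ('Sample', 'Basic')
```

Keeping the application part (Sample) and the cube part (Basic) separate mirrors how Essbase itself organizes applications and their databases.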
&nbsp;I selected 'No' for the Import Metadata because I like to create my Physical Layer one piece at a time.&nbsp;</span></span></div><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span></div><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://1.bp.blogspot.com/-BZUzakY4Rp0/WZUZWdP5lfI/AAAAAAAAK8Y/mQJkUc5tbhQZxt5pGf6mA0IcRjMJwVYBwCLcBGAs/s1600/01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="1048" data-original-width="1600" height="261" src="https://1.bp.blogspot.com/-BZUzakY4Rp0/WZUZWdP5lfI/AAAAAAAAK8Y/mQJkUc5tbhQZxt5pGf6mA0IcRjMJwVYBwCLcBGAs/s400/01.png" width="400" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">In the Physical Layer I created a New Database...</span><span style="font-size: x-small;"></span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a 
href="https://4.bp.blogspot.com/-dn4x4hUPPN8/WZUZdJosdUI/AAAAAAAAK9c/5s171Qz0-SkNtBuxmD01laI2RFV3B7NAQCEwYBhgL/s1600/02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="376" data-original-width="706" height="212" src="https://4.bp.blogspot.com/-dn4x4hUPPN8/WZUZdJosdUI/AAAAAAAAK9c/5s171Qz0-SkNtBuxmD01laI2RFV3B7NAQCEwYBhgL/s400/02.png" width="400" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">Since I am connecting to an Essbase Cloud instance, I selected the latest version of Essbase available in the list of Database Types.</span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://2.bp.blogspot.com/-QsnPG1uL39U/WZUZWSt7j-I/AAAAAAAAK9c/IoUXoSiuVi8Nb2FeCOt3sSJZWd3YqsvywCEwYBhgL/s1600/03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="968" data-original-width="822" height="400" src="https://2.bp.blogspot.com/-QsnPG1uL39U/WZUZWSt7j-I/AAAAAAAAK9c/IoUXoSiuVi8Nb2FeCOt3sSJZWd3YqsvywCEwYBhgL/s400/03.png" width="338" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span 
style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><br />
Name="Grid Table 2 Accent 1"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 1"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 1"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 1"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 2"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 2"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 2"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 2"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 2"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 3"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 3"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 3"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 3"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 3"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 4"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 4"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 4"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 4"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 4"/> <w:LsdException Locked="false" Priority="51" 
Name="Grid Table 6 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 5"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 5"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 5"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 5"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 5"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 6"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 6"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 6"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 6"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 6"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 6"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 6"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 1"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 1"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 1"/> <w:LsdException 
Locked="false" Priority="49" Name="List Table 4 Accent 1"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 1"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 2"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 2"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 2"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 2"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 2"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 3"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 3"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 3"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 3"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 3"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 4"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 4"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 4"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 4"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 4"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 4"/> 
</w:LatentStyles></xml><![endif]--> <br /><div style="margin: 0in;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="color: black; 
font-size: x-small;">Next, I created the connection pool and entered the public IP address of my Essbase Cloud instance.</span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><br /></span></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://4.bp.blogspot.com/-QajS_FQrB1Q/WZUZW1bA8SI/AAAAAAAAK9c/3VQXywDAazkD7cuUkfUcNxAk8mDZT6mjwCEwYBhgL/s1600/04.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="1138" data-original-width="948" height="400" src="https://4.bp.blogspot.com/-QajS_FQrB1Q/WZUZW1bA8SI/AAAAAAAAK9c/3VQXywDAazkD7cuUkfUcNxAk8mDZT6mjwCEwYBhgL/s400/04.png" width="332" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><br /></span></span><!--[if gte mso 9]><xml> <w:LatentStyles DefLockedState="false" DefUnhideWhenUsed="false" DefSemiHidden="false" DefQFormat="false" DefPriority="99" LatentStyleCount="371">
<w:LsdException 
Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="TOC Heading"/> <w:LsdException Locked="false" Priority="41" Name="Plain Table 1"/> <w:LsdException Locked="false" Priority="42" Name="Plain Table 2"/> <w:LsdException Locked="false" Priority="43" Name="Plain Table 3"/> <w:LsdException Locked="false" Priority="44" Name="Plain Table 4"/> <w:LsdException Locked="false" Priority="45" Name="Plain Table 5"/> <w:LsdException Locked="false" Priority="40" Name="Grid Table Light"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 1"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 1"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 1"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 1"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 1"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 2"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 2"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 2"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 2"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 2"/> <w:LsdException Locked="false" 
Priority="51" Name="Grid Table 6 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 3"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 3"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 3"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 3"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 3"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 4"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 4"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 4"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 4"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 4"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 5"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 5"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 5"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 5"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 5"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 6"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 6"/> <w:LsdException 
Locked="false" Priority="48" Name="Grid Table 3 Accent 6"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 6"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 6"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 6"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 6"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 1"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 1"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 1"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 1"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 1"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 2"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 2"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 2"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 2"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 2"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful 
Accent 2"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 3"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 3"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 3"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 3"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 3"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 4"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 4"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 4"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 4"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 4"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 5"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 5"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 5"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 5"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 5"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 6"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 6"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 6"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 
6"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 6"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 6"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 6"/> </w:LatentStyles></xml><![endif]--><!--[if gte mso 10]><style> /* Style Definitions */ table.MsoNormalTable {mso-style-name:"Table Normal"; mso-tstyle-rowband-size:0; mso-tstyle-colband-size:0; mso-style-noshow:yes; mso-style-priority:99; mso-style-parent:""; mso-padding-alt:0in 5.4pt 0in 5.4pt; mso-para-margin-top:0in; mso-para-margin-right:0in; mso-para-margin-bottom:8.0pt; mso-para-margin-left:0in; line-height:107%; mso-pagination:widow-orphan; font-size:11.0pt; font-family:"Calibri",sans-serif; mso-ascii-font-family:Calibri; mso-ascii-theme-font:minor-latin; mso-hansi-font-family:Calibri; mso-hansi-theme-font:minor-latin; mso-bidi-font-family:"Times New Roman"; mso-bidi-theme-font:minor-bidi;} </style><![endif]--> <br /><div style="margin: 0in;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="color: black; font-size: x-small;">I import the Essbase metadata by right clicking on connection pool</span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><br /></span></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://1.bp.blogspot.com/-8JBB_amymBw/WZUZWro4chI/AAAAAAAAK9c/3DfvAUilCFMGT01qFL2NHK56qz0yIn_7wCEwYBhgL/s1600/05.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="413" data-original-width="692" height="237" 
src="https://1.bp.blogspot.com/-8JBB_amymBw/WZUZWro4chI/AAAAAAAAK9c/3DfvAUilCFMGT01qFL2NHK56qz0yIn_7wCEwYBhgL/s400/05.png" width="400" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"></span></span> <!--[if gte mso 9]><xml> <o:OfficeDocumentSettings> <o:RelyOnVML/> <o:AllowPNG/> </o:OfficeDocumentSettings></xml><![endif]--><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><br /></span></span><!--[if gte mso 9]><xml> <w:WordDocument> <w:View>Normal</w:View> <w:Zoom>0</w:Zoom> <w:TrackMoves/> <w:TrackFormatting/> <w:PunctuationKerning/> <w:ValidateAgainstSchemas/> <w:SaveIfXMLInvalid>false</w:SaveIfXMLInvalid> <w:IgnoreMixedContent>false</w:IgnoreMixedContent> <w:AlwaysShowPlaceholderText>false</w:AlwaysShowPlaceholderText> <w:DoNotPromoteQF/> <w:LidThemeOther>EN-US</w:LidThemeOther> <w:LidThemeAsian>X-NONE</w:LidThemeAsian> <w:LidThemeComplexScript>X-NONE</w:LidThemeComplexScript> <w:Compatibility> <w:BreakWrappedTables/> <w:SnapToGridInCell/> <w:WrapTextWithPunct/> <w:UseAsianBreakRules/> <w:DontGrowAutofit/> <w:SplitPgBreakAndParaMark/> <w:EnableOpenTypeKerning/> <w:DontFlipMirrorIndents/> <w:OverrideTableStyleHps/> </w:Compatibility> <m:mathPr> <m:mathFont m:val="Cambria Math"/> <m:brkBin m:val="before"/> <m:brkBinSub m:val="&#45;-"/> <m:smallFrac m:val="off"/> <m:dispDef/> <m:lMargin m:val="0"/> <m:rMargin m:val="0"/> <m:defJc m:val="centerGroup"/> <m:wrapIndent m:val="1440"/> <m:intLim m:val="subSup"/> <m:naryLim m:val="undOvr"/> </m:mathPr></w:WordDocument></xml><![endif]--><!--[if gte mso 9]><xml> <w:LatentStyles DefLockedState="false" DefUnhideWhenUsed="false" DefSemiHidden="false" DefQFormat="false" DefPriority="99" LatentStyleCount="371"> <w:LsdException Locked="false" Priority="0" QFormat="true" Name="Normal"/> 
<w:LsdException Locked="false" Priority="9" QFormat="true" Name="heading 1"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 2"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 3"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 4"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 5"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 6"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 7"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 8"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 9"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 5"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 6"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 7"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 8"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 9"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 1"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" 
Name="toc 2"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 3"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 4"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 5"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 6"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 7"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 8"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 9"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Normal Indent"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="footnote text"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="annotation text"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="header"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="footer"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index heading"/> <w:LsdException Locked="false" Priority="35" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="caption"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="table of figures"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="envelope address"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="envelope return"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="footnote reference"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="annotation reference"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="line number"/> <w:LsdException Locked="false" 
SemiHidden="true" UnhideWhenUsed="true" Name="page number"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="endnote reference"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="endnote text"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="table of authorities"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="macro"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="toa heading"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Bullet"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Number"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List 5"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Bullet 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Bullet 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Bullet 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Bullet 5"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Number 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Number 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Number 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Number 5"/> <w:LsdException Locked="false" Priority="10" QFormat="true" Name="Title"/> <w:LsdException 
Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Closing"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Signature"/> <w:LsdException Locked="false" Priority="1" SemiHidden="true" UnhideWhenUsed="true" Name="Default Paragraph Font"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text Indent"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Continue"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Continue 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Continue 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Continue 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Continue 5"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Message Header"/> <w:LsdException Locked="false" Priority="11" QFormat="true" Name="Subtitle"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Salutation"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Date"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text First Indent"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text First Indent 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Note Heading"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text Indent 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text Indent 3"/> <w:LsdException 
Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Block Text"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Hyperlink"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="FollowedHyperlink"/> <w:LsdException Locked="false" Priority="22" QFormat="true" Name="Strong"/> <w:LsdException Locked="false" Priority="20" QFormat="true" Name="Emphasis"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Document Map"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Plain Text"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="E-mail Signature"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Top of Form"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Bottom of Form"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Normal (Web)"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Acronym"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Address"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Cite"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Code"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Definition"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Keyboard"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Preformatted"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Sample"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Typewriter"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Variable"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" 
Name="Normal Table"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="annotation subject"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="No List"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Outline List 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Outline List 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Outline List 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Simple 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Simple 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Simple 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Classic 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Classic 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Classic 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Classic 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Colorful 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Colorful 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Colorful 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Columns 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Columns 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Columns 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Columns 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Columns 5"/> <w:LsdException Locked="false" SemiHidden="true" 
UnhideWhenUsed="true" Name="Table Grid 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Grid 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Grid 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Grid 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Grid 5"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Grid 6"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Grid 7"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Grid 8"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table List 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table List 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table List 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table List 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table List 5"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table List 6"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table List 7"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table List 8"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table 3D effects 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table 3D effects 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table 3D effects 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Contemporary"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Elegant"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" 
Name="Table Professional"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Subtle 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Subtle 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Web 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Web 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Web 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Balloon Text"/> <w:LsdException Locked="false" Priority="39" Name="Table Grid"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Theme"/> <w:LsdException Locked="false" SemiHidden="true" Name="Placeholder Text"/> <w:LsdException Locked="false" Priority="1" QFormat="true" Name="No Spacing"/> <w:LsdException Locked="false" Priority="60" Name="Light Shading"/> <w:LsdException Locked="false" Priority="61" Name="Light List"/> <w:LsdException Locked="false" Priority="62" Name="Light Grid"/> <w:LsdException Locked="false" Priority="63" Name="Medium Shading 1"/> <w:LsdException Locked="false" Priority="64" Name="Medium Shading 2"/> <w:LsdException Locked="false" Priority="65" Name="Medium List 1"/> <w:LsdException Locked="false" Priority="66" Name="Medium List 2"/> <w:LsdException Locked="false" Priority="67" Name="Medium Grid 1"/> <w:LsdException Locked="false" Priority="68" Name="Medium Grid 2"/> <w:LsdException Locked="false" Priority="69" Name="Medium Grid 3"/> <w:LsdException Locked="false" Priority="70" Name="Dark List"/> <w:LsdException Locked="false" Priority="71" Name="Colorful Shading"/> <w:LsdException Locked="false" Priority="72" Name="Colorful List"/> <w:LsdException Locked="false" Priority="73" Name="Colorful Grid"/> <w:LsdException Locked="false" Priority="60" Name="Light Shading Accent 1"/> <w:LsdException Locked="false" Priority="61" Name="Light List Accent 1"/> 
<w:LsdException Locked="false" Priority="62" Name="Light Grid Accent 1"/> <w:LsdException Locked="false" Priority="63" Name="Medium Shading 1 Accent 1"/> <w:LsdException Locked="false" Priority="64" Name="Medium Shading 2 Accent 1"/> <w:LsdException Locked="false" Priority="65" Name="Medium List 1 Accent 1"/> <w:LsdException Locked="false" SemiHidden="true" Name="Revision"/> <w:LsdException Locked="false" Priority="34" QFormat="true" Name="List Paragraph"/> <w:LsdException Locked="false" Priority="29" QFormat="true" Name="Quote"/> <w:LsdException Locked="false" Priority="30" QFormat="true" Name="Intense Quote"/> <w:LsdException Locked="false" Priority="66" Name="Medium List 2 Accent 1"/> <w:LsdException Locked="false" Priority="67" Name="Medium Grid 1 Accent 1"/> <w:LsdException Locked="false" Priority="68" Name="Medium Grid 2 Accent 1"/> <w:LsdException Locked="false" Priority="69" Name="Medium Grid 3 Accent 1"/> <w:LsdException Locked="false" Priority="70" Name="Dark List Accent 1"/> <w:LsdException Locked="false" Priority="71" Name="Colorful Shading Accent 1"/> <w:LsdException Locked="false" Priority="72" Name="Colorful List Accent 1"/> <w:LsdException Locked="false" Priority="73" Name="Colorful Grid Accent 1"/> <w:LsdException Locked="false" Priority="60" Name="Light Shading Accent 2"/> <w:LsdException Locked="false" Priority="61" Name="Light List Accent 2"/> <w:LsdException Locked="false" Priority="62" Name="Light Grid Accent 2"/> <w:LsdException Locked="false" Priority="63" Name="Medium Shading 1 Accent 2"/> <w:LsdException Locked="false" Priority="64" Name="Medium Shading 2 Accent 2"/> <w:LsdException Locked="false" Priority="65" Name="Medium List 1 Accent 2"/> <w:LsdException Locked="false" Priority="66" Name="Medium List 2 Accent 2"/> <w:LsdException Locked="false" Priority="67" Name="Medium Grid 1 Accent 2"/> <w:LsdException Locked="false" Priority="68" Name="Medium Grid 2 Accent 2"/> <w:LsdException Locked="false" Priority="69" Name="Medium Grid 3 Accent 
2"/> <w:LsdException Locked="false" Priority="70" Name="Dark List Accent 2"/> <w:LsdException Locked="false" Priority="71" Name="Colorful Shading Accent 2"/> <w:LsdException Locked="false" Priority="72" Name="Colorful List Accent 2"/> <w:LsdException Locked="false" Priority="73" Name="Colorful Grid Accent 2"/> <w:LsdException Locked="false" Priority="60" Name="Light Shading Accent 3"/> <w:LsdException Locked="false" Priority="61" Name="Light List Accent 3"/> <w:LsdException Locked="false" Priority="62" Name="Light Grid Accent 3"/> <w:LsdException Locked="false" Priority="63" Name="Medium Shading 1 Accent 3"/> <w:LsdException Locked="false" Priority="64" Name="Medium Shading 2 Accent 3"/> <w:LsdException Locked="false" Priority="65" Name="Medium List 1 Accent 3"/> <w:LsdException Locked="false" Priority="66" Name="Medium List 2 Accent 3"/> <w:LsdException Locked="false" Priority="67" Name="Medium Grid 1 Accent 3"/> <w:LsdException Locked="false" Priority="68" Name="Medium Grid 2 Accent 3"/> <w:LsdException Locked="false" Priority="69" Name="Medium Grid 3 Accent 3"/> <w:LsdException Locked="false" Priority="70" Name="Dark List Accent 3"/> <w:LsdException Locked="false" Priority="71" Name="Colorful Shading Accent 3"/> <w:LsdException Locked="false" Priority="72" Name="Colorful List Accent 3"/> <w:LsdException Locked="false" Priority="73" Name="Colorful Grid Accent 3"/> <w:LsdException Locked="false" Priority="60" Name="Light Shading Accent 4"/> <w:LsdException Locked="false" Priority="61" Name="Light List Accent 4"/> <w:LsdException Locked="false" Priority="62" Name="Light Grid Accent 4"/> <w:LsdException Locked="false" Priority="63" Name="Medium Shading 1 Accent 4"/> <w:LsdException Locked="false" Priority="64" Name="Medium Shading 2 Accent 4"/> <w:LsdException Locked="false" Priority="65" Name="Medium List 1 Accent 4"/> <w:LsdException Locked="false" Priority="66" Name="Medium List 2 Accent 4"/> <w:LsdException Locked="false" Priority="67" Name="Medium Grid 1 Accent 
4"/> <w:LsdException Locked="false" Priority="68" Name="Medium Grid 2 Accent 4"/> <w:LsdException Locked="false" Priority="69" Name="Medium Grid 3 Accent 4"/> <w:LsdException Locked="false" Priority="70" Name="Dark List Accent 4"/> <w:LsdException Locked="false" Priority="71" Name="Colorful Shading Accent 4"/> <w:LsdException Locked="false" Priority="72" Name="Colorful List Accent 4"/> <w:LsdException Locked="false" Priority="73" Name="Colorful Grid Accent 4"/> <w:LsdException Locked="false" Priority="60" Name="Light Shading Accent 5"/> <w:LsdException Locked="false" Priority="61" Name="Light List Accent 5"/> <w:LsdException Locked="false" Priority="62" Name="Light Grid Accent 5"/> <w:LsdException Locked="false" Priority="63" Name="Medium Shading 1 Accent 5"/> <w:LsdException Locked="false" Priority="64" Name="Medium Shading 2 Accent 5"/> <w:LsdException Locked="false" Priority="65" Name="Medium List 1 Accent 5"/> <w:LsdException Locked="false" Priority="66" Name="Medium List 2 Accent 5"/> <w:LsdException Locked="false" Priority="67" Name="Medium Grid 1 Accent 5"/> <w:LsdException Locked="false" Priority="68" Name="Medium Grid 2 Accent 5"/> <w:LsdException Locked="false" Priority="69" Name="Medium Grid 3 Accent 5"/> <w:LsdException Locked="false" Priority="70" Name="Dark List Accent 5"/> <w:LsdException Locked="false" Priority="71" Name="Colorful Shading Accent 5"/> <w:LsdException Locked="false" Priority="72" Name="Colorful List Accent 5"/> <w:LsdException Locked="false" Priority="73" Name="Colorful Grid Accent 5"/> <w:LsdException Locked="false" Priority="60" Name="Light Shading Accent 6"/> <w:LsdException Locked="false" Priority="61" Name="Light List Accent 6"/> <w:LsdException Locked="false" Priority="62" Name="Light Grid Accent 6"/> <w:LsdException Locked="false" Priority="63" Name="Medium Shading 1 Accent 6"/> <w:LsdException Locked="false" Priority="64" Name="Medium Shading 2 Accent 6"/> <w:LsdException Locked="false" Priority="65" Name="Medium List 1 Accent 
6"/> <w:LsdException Locked="false" Priority="66" Name="Medium List 2 Accent 6"/> <w:LsdException Locked="false" Priority="67" Name="Medium Grid 1 Accent 6"/> <w:LsdException Locked="false" Priority="68" Name="Medium Grid 2 Accent 6"/> <w:LsdException Locked="false" Priority="69" Name="Medium Grid 3 Accent 6"/> <w:LsdException Locked="false" Priority="70" Name="Dark List Accent 6"/> <w:LsdException Locked="false" Priority="71" Name="Colorful Shading Accent 6"/> <w:LsdException Locked="false" Priority="72" Name="Colorful List Accent 6"/> <w:LsdException Locked="false" Priority="73" Name="Colorful Grid Accent 6"/> <w:LsdException Locked="false" Priority="19" QFormat="true" Name="Subtle Emphasis"/> <w:LsdException Locked="false" Priority="21" QFormat="true" Name="Intense Emphasis"/> <w:LsdException Locked="false" Priority="31" QFormat="true" Name="Subtle Reference"/> <w:LsdException Locked="false" Priority="32" QFormat="true" Name="Intense Reference"/> <w:LsdException Locked="false" Priority="33" QFormat="true" Name="Book Title"/> <w:LsdException Locked="false" Priority="37" SemiHidden="true" UnhideWhenUsed="true" Name="Bibliography"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="TOC Heading"/> <w:LsdException Locked="false" Priority="41" Name="Plain Table 1"/> <w:LsdException Locked="false" Priority="42" Name="Plain Table 2"/> <w:LsdException Locked="false" Priority="43" Name="Plain Table 3"/> <w:LsdException Locked="false" Priority="44" Name="Plain Table 4"/> <w:LsdException Locked="false" Priority="45" Name="Plain Table 5"/> <w:LsdException Locked="false" Priority="40" Name="Grid Table Light"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4"/> <w:LsdException Locked="false" 
Priority="50" Name="Grid Table 5 Dark"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 1"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 1"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 1"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 1"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 1"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 2"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 2"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 2"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 2"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 2"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 3"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 3"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 3"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 3"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 3"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 4"/> <w:LsdException Locked="false" 
Priority="47" Name="Grid Table 2 Accent 4"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 4"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 4"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 4"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 5"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 5"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 5"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 5"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 5"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 6"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 6"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 6"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 6"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 6"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 6"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 6"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful"/> 
<w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 1"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 1"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 1"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 1"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 1"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 2"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 2"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 2"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 2"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 2"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 3"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 3"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 3"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 3"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 3"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 4"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 4"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 4"/> 
<w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 4"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 4"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 5"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 5"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 5"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 5"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 5"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 6"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 6"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 6"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 6"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 6"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 6"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 6"/> </w:LatentStyles></xml><![endif]--><!--[if gte mso 10]><style> /* Style Definitions */ table.MsoNormalTable {mso-style-name:"Table Normal"; mso-tstyle-rowband-size:0; mso-tstyle-colband-size:0; mso-style-noshow:yes; mso-style-priority:99; mso-style-parent:""; mso-padding-alt:0in 5.4pt 0in 5.4pt; mso-para-margin-top:0in; mso-para-margin-right:0in; mso-para-margin-bottom:8.0pt; mso-para-margin-left:0in; line-height:107%; mso-pagination:widow-orphan; font-size:11.0pt; font-family:"Calibri",sans-serif; mso-ascii-font-family:Calibri; 
mso-ascii-theme-font:minor-latin; mso-hansi-font-family:Calibri; mso-hansi-theme-font:minor-latin; mso-bidi-font-family:"Times New Roman"; mso-bidi-theme-font:minor-bidi;} </style><![endif]--> <br /><div style="margin: 0in;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="color: black; font-size: x-small;">I authenticated to the Essbase Cloud instance and navigated to my Cube to import and started the Import.&nbsp; </span><span style="font-size: x-small;"><span style="color: black;">The process started and up popped a message.</span></span></span><!--[if gte mso 9]><xml> <o:OfficeDocumentSettings> <o:RelyOnVML/> <o:AllowPNG/> </o:OfficeDocumentSettings></xml><![endif]--><br /><!--[if gte mso 9]><xml> <w:WordDocument> <w:View>Normal</w:View> <w:Zoom>0</w:Zoom> <w:TrackMoves/> <w:TrackFormatting/> <w:PunctuationKerning/> <w:ValidateAgainstSchemas/> <w:SaveIfXMLInvalid>false</w:SaveIfXMLInvalid> <w:IgnoreMixedContent>false</w:IgnoreMixedContent> <w:AlwaysShowPlaceholderText>false</w:AlwaysShowPlaceholderText> <w:DoNotPromoteQF/> <w:LidThemeOther>EN-US</w:LidThemeOther> <w:LidThemeAsian>X-NONE</w:LidThemeAsian> <w:LidThemeComplexScript>X-NONE</w:LidThemeComplexScript> <w:Compatibility> <w:BreakWrappedTables/> <w:SnapToGridInCell/> <w:WrapTextWithPunct/> <w:UseAsianBreakRules/> <w:DontGrowAutofit/> <w:SplitPgBreakAndParaMark/> <w:EnableOpenTypeKerning/> <w:DontFlipMirrorIndents/> <w:OverrideTableStyleHps/> </w:Compatibility> <m:mathPr> <m:mathFont m:val="Cambria Math"/> <m:brkBin m:val="before"/> <m:brkBinSub m:val="&#45;-"/> <m:smallFrac m:val="off"/> <m:dispDef/> <m:lMargin m:val="0"/> <m:rMargin m:val="0"/> <m:defJc m:val="centerGroup"/> <m:wrapIndent m:val="1440"/> <m:intLim m:val="subSup"/> <m:naryLim m:val="undOvr"/> </m:mathPr></w:WordDocument></xml><![endif]--><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><span style="color: 
black;"></span></span></span> </div><div style="margin: 0in;"><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://1.bp.blogspot.com/-C5QeOldwhqY/WZb_t7bVUHI/AAAAAAAAK90/JIksQ7sXmIo33O2djfyHESIqpg-FrqkYQCLcBGAs/s1600/07.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1048" data-original-width="1600" height="261" src="https://1.bp.blogspot.com/-C5QeOldwhqY/WZb_t7bVUHI/AAAAAAAAK90/JIksQ7sXmIo33O2djfyHESIqpg-FrqkYQCLcBGAs/s400/07.png" width="400" /></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"></span></span></div><br /><!--[if gte mso 9]><xml> <o:OfficeDocumentSettings> <o:RelyOnVML/> <o:AllowPNG/> </o:OfficeDocumentSettings></xml><![endif]--><!--[if gte mso 9]><xml> <w:WordDocument> <w:View>Normal</w:View> <w:Zoom>0</w:Zoom> <w:TrackMoves/> <w:TrackFormatting/> <w:PunctuationKerning/> <w:ValidateAgainstSchemas/> <w:SaveIfXMLInvalid>false</w:SaveIfXMLInvalid> <w:IgnoreMixedContent>false</w:IgnoreMixedContent> <w:AlwaysShowPlaceholderText>false</w:AlwaysShowPlaceholderText> <w:DoNotPromoteQF/> <w:LidThemeOther>EN-US</w:LidThemeOther> <w:LidThemeAsian>X-NONE</w:LidThemeAsian> <w:LidThemeComplexScript>X-NONE</w:LidThemeComplexScript> <w:Compatibility> <w:BreakWrappedTables/> <w:SnapToGridInCell/> <w:WrapTextWithPunct/> <w:UseAsianBreakRules/> <w:DontGrowAutofit/> <w:SplitPgBreakAndParaMark/> <w:EnableOpenTypeKerning/> <w:DontFlipMirrorIndents/> <w:OverrideTableStyleHps/> </w:Compatibility> <m:mathPr> <m:mathFont m:val="Cambria Math"/> <m:brkBin m:val="before"/> <m:brkBinSub m:val="&#45;-"/> <m:smallFrac m:val="off"/> <m:dispDef/> <m:lMargin m:val="0"/> <m:rMargin m:val="0"/> <m:defJc m:val="centerGroup"/> <m:wrapIndent m:val="1440"/> <m:intLim m:val="subSup"/> <m:naryLim 
m:val="undOvr"/> </m:mathPr></w:WordDocument></xml><![endif]--><!--[if gte mso 9]><xml> <w:LatentStyles DefLockedState="false" DefUnhideWhenUsed="false" DefSemiHidden="false" DefQFormat="false" DefPriority="99" LatentStyleCount="371"> <w:LsdException Locked="false" Priority="0" QFormat="true" Name="Normal"/> <w:LsdException Locked="false" Priority="9" QFormat="true" Name="heading 1"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 2"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 3"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 4"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 5"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 6"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 7"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 8"/> <w:LsdException Locked="false" Priority="9" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 9"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 5"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 6"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 7"/> <w:LsdException Locked="false" SemiHidden="true" 
UnhideWhenUsed="true" Name="index 8"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index 9"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 1"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 2"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 3"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 4"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 5"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 6"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 7"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 8"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" Name="toc 9"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Normal Indent"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="footnote text"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="annotation text"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="header"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="footer"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="index heading"/> <w:LsdException Locked="false" Priority="35" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="caption"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="table of figures"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="envelope address"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="envelope return"/> 
<w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="footnote reference"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="annotation reference"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="line number"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="page number"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="endnote reference"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="endnote text"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="table of authorities"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="macro"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="toa heading"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Bullet"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Number"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List 5"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Bullet 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Bullet 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Bullet 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Bullet 5"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Number 2"/> <w:LsdException Locked="false" SemiHidden="true" 
UnhideWhenUsed="true" Name="List Number 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Number 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Number 5"/> <w:LsdException Locked="false" Priority="10" QFormat="true" Name="Title"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Closing"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Signature"/> <w:LsdException Locked="false" Priority="1" SemiHidden="true" UnhideWhenUsed="true" Name="Default Paragraph Font"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text Indent"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Continue"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Continue 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Continue 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Continue 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="List Continue 5"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Message Header"/> <w:LsdException Locked="false" Priority="11" QFormat="true" Name="Subtitle"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Salutation"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Date"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text First Indent"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text First Indent 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Note Heading"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body 
Text 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text Indent 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Body Text Indent 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Block Text"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Hyperlink"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="FollowedHyperlink"/> <w:LsdException Locked="false" Priority="22" QFormat="true" Name="Strong"/> <w:LsdException Locked="false" Priority="20" QFormat="true" Name="Emphasis"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Document Map"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Plain Text"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="E-mail Signature"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Top of Form"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Bottom of Form"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Normal (Web)"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Acronym"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Address"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Cite"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Code"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Definition"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Keyboard"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Preformatted"/> <w:LsdException Locked="false" 
SemiHidden="true" UnhideWhenUsed="true" Name="HTML Sample"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Typewriter"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="HTML Variable"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Normal Table"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="annotation subject"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="No List"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Outline List 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Outline List 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Outline List 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Simple 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Simple 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Simple 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Classic 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Classic 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Classic 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Classic 4"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Colorful 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Colorful 2"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Colorful 3"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Columns 1"/> <w:LsdException Locked="false" SemiHidden="true" UnhideWhenUsed="true" Name="Table Columns 2"/> <w:LsdException Locked="false" 
</w:LatentStyles></xml><![endif]--> <br /><div style="margin-bottom: .0001pt; margin: 0in;"><span style="color: black; font-family: &quot;arial&quot; , sans-serif; font-size: 10.0pt;">So it was time to try the Import Metadata option from the File menu after deleting everything in the Physical Layer.</span></div><div style="margin-bottom: .0001pt; margin: 0in;"><br /></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span> <div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://1.bp.blogspot.com/-UU63wKe4OaY/WZUZYIDUrRI/AAAAAAAAK9c/gFg7FhAxoY8vqOBErwFKU7BdtWR9G9ElQCEwYBhgL/s1600/08.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="832" data-original-width="783" height="400" src="https://1.bp.blogspot.com/-UU63wKe4OaY/WZUZYIDUrRI/AAAAAAAAK9c/gFg7FhAxoY8vqOBErwFKU7BdtWR9G9ElQCEwYBhgL/s400/08.png" width="376" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span><br /><!--[if gte mso 9]><xml> <w:LatentStyles DefLockedState="false" DefUnhideWhenUsed="false" DefSemiHidden="false" DefQFormat="false" DefPriority="99" LatentStyleCount="371"> <w:LsdException Locked="false" Priority="9"
SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="heading 7"/>
<w:LsdException Locked="false" Priority="65" Name="Medium List 1 Accent 3"/> <w:LsdException Locked="false" Priority="66" Name="Medium List 2 Accent 3"/> <w:LsdException Locked="false" Priority="67" Name="Medium Grid 1 Accent 3"/> <w:LsdException Locked="false" Priority="68" Name="Medium Grid 2 Accent 3"/> <w:LsdException Locked="false" Priority="69" Name="Medium Grid 3 Accent 3"/> <w:LsdException Locked="false" Priority="70" Name="Dark List Accent 3"/> <w:LsdException Locked="false" Priority="71" Name="Colorful Shading Accent 3"/> <w:LsdException Locked="false" Priority="72" Name="Colorful List Accent 3"/> <w:LsdException Locked="false" Priority="73" Name="Colorful Grid Accent 3"/> <w:LsdException Locked="false" Priority="60" Name="Light Shading Accent 4"/> <w:LsdException Locked="false" Priority="61" Name="Light List Accent 4"/> <w:LsdException Locked="false" Priority="62" Name="Light Grid Accent 4"/> <w:LsdException Locked="false" Priority="63" Name="Medium Shading 1 Accent 4"/> <w:LsdException Locked="false" Priority="64" Name="Medium Shading 2 Accent 4"/> <w:LsdException Locked="false" Priority="65" Name="Medium List 1 Accent 4"/> <w:LsdException Locked="false" Priority="66" Name="Medium List 2 Accent 4"/> <w:LsdException Locked="false" Priority="67" Name="Medium Grid 1 Accent 4"/> <w:LsdException Locked="false" Priority="68" Name="Medium Grid 2 Accent 4"/> <w:LsdException Locked="false" Priority="69" Name="Medium Grid 3 Accent 4"/> <w:LsdException Locked="false" Priority="70" Name="Dark List Accent 4"/> <w:LsdException Locked="false" Priority="71" Name="Colorful Shading Accent 4"/> <w:LsdException Locked="false" Priority="72" Name="Colorful List Accent 4"/> <w:LsdException Locked="false" Priority="73" Name="Colorful Grid Accent 4"/> <w:LsdException Locked="false" Priority="60" Name="Light Shading Accent 5"/> <w:LsdException Locked="false" Priority="61" Name="Light List Accent 5"/> <w:LsdException Locked="false" Priority="62" Name="Light Grid Accent 5"/> 
<w:LsdException Locked="false" Priority="63" Name="Medium Shading 1 Accent 5"/> <w:LsdException Locked="false" Priority="64" Name="Medium Shading 2 Accent 5"/> <w:LsdException Locked="false" Priority="65" Name="Medium List 1 Accent 5"/> <w:LsdException Locked="false" Priority="66" Name="Medium List 2 Accent 5"/> <w:LsdException Locked="false" Priority="67" Name="Medium Grid 1 Accent 5"/> <w:LsdException Locked="false" Priority="68" Name="Medium Grid 2 Accent 5"/> <w:LsdException Locked="false" Priority="69" Name="Medium Grid 3 Accent 5"/> <w:LsdException Locked="false" Priority="70" Name="Dark List Accent 5"/> <w:LsdException Locked="false" Priority="71" Name="Colorful Shading Accent 5"/> <w:LsdException Locked="false" Priority="72" Name="Colorful List Accent 5"/> <w:LsdException Locked="false" Priority="73" Name="Colorful Grid Accent 5"/> <w:LsdException Locked="false" Priority="60" Name="Light Shading Accent 6"/> <w:LsdException Locked="false" Priority="61" Name="Light List Accent 6"/> <w:LsdException Locked="false" Priority="62" Name="Light Grid Accent 6"/> <w:LsdException Locked="false" Priority="63" Name="Medium Shading 1 Accent 6"/> <w:LsdException Locked="false" Priority="64" Name="Medium Shading 2 Accent 6"/> <w:LsdException Locked="false" Priority="65" Name="Medium List 1 Accent 6"/> <w:LsdException Locked="false" Priority="66" Name="Medium List 2 Accent 6"/> <w:LsdException Locked="false" Priority="67" Name="Medium Grid 1 Accent 6"/> <w:LsdException Locked="false" Priority="68" Name="Medium Grid 2 Accent 6"/> <w:LsdException Locked="false" Priority="69" Name="Medium Grid 3 Accent 6"/> <w:LsdException Locked="false" Priority="70" Name="Dark List Accent 6"/> <w:LsdException Locked="false" Priority="71" Name="Colorful Shading Accent 6"/> <w:LsdException Locked="false" Priority="72" Name="Colorful List Accent 6"/> <w:LsdException Locked="false" Priority="73" Name="Colorful Grid Accent 6"/> <w:LsdException Locked="false" Priority="19" QFormat="true" 
Name="Subtle Emphasis"/> <w:LsdException Locked="false" Priority="21" QFormat="true" Name="Intense Emphasis"/> <w:LsdException Locked="false" Priority="31" QFormat="true" Name="Subtle Reference"/> <w:LsdException Locked="false" Priority="32" QFormat="true" Name="Intense Reference"/> <w:LsdException Locked="false" Priority="33" QFormat="true" Name="Book Title"/> <w:LsdException Locked="false" Priority="37" SemiHidden="true" UnhideWhenUsed="true" Name="Bibliography"/> <w:LsdException Locked="false" Priority="39" SemiHidden="true" UnhideWhenUsed="true" QFormat="true" Name="TOC Heading"/> <w:LsdException Locked="false" Priority="41" Name="Plain Table 1"/> <w:LsdException Locked="false" Priority="42" Name="Plain Table 2"/> <w:LsdException Locked="false" Priority="43" Name="Plain Table 3"/> <w:LsdException Locked="false" Priority="44" Name="Plain Table 4"/> <w:LsdException Locked="false" Priority="45" Name="Plain Table 5"/> <w:LsdException Locked="false" Priority="40" Name="Grid Table Light"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 1"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 1"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 1"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 1"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 1"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 1"/> <w:LsdException 
Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 2"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 2"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 2"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 2"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 2"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 3"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 3"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 3"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 3"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 3"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 4"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 4"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 4"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 4"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 4"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 5"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 5"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 5"/> 
<w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 5"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 5"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="46" Name="Grid Table 1 Light Accent 6"/> <w:LsdException Locked="false" Priority="47" Name="Grid Table 2 Accent 6"/> <w:LsdException Locked="false" Priority="48" Name="Grid Table 3 Accent 6"/> <w:LsdException Locked="false" Priority="49" Name="Grid Table 4 Accent 6"/> <w:LsdException Locked="false" Priority="50" Name="Grid Table 5 Dark Accent 6"/> <w:LsdException Locked="false" Priority="51" Name="Grid Table 6 Colorful Accent 6"/> <w:LsdException Locked="false" Priority="52" Name="Grid Table 7 Colorful Accent 6"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 1"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 1"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 1"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 1"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 1"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 1"/> <w:LsdException Locked="false" Priority="46" 
Name="List Table 1 Light Accent 2"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 2"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 2"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 2"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 2"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 2"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 3"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 3"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 3"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 3"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 3"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 3"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 4"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 4"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 4"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 4"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 4"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 4"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 5"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 5"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 5"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 5"/> <w:LsdException Locked="false" Priority="50" 
Name="List Table 5 Dark Accent 5"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 5"/> <w:LsdException Locked="false" Priority="46" Name="List Table 1 Light Accent 6"/> <w:LsdException Locked="false" Priority="47" Name="List Table 2 Accent 6"/> <w:LsdException Locked="false" Priority="48" Name="List Table 3 Accent 6"/> <w:LsdException Locked="false" Priority="49" Name="List Table 4 Accent 6"/> <w:LsdException Locked="false" Priority="50" Name="List Table 5 Dark Accent 6"/> <w:LsdException Locked="false" Priority="51" Name="List Table 6 Colorful Accent 6"/> <w:LsdException Locked="false" Priority="52" Name="List Table 7 Colorful Accent 6"/> </w:LatentStyles></xml><![endif]--><!--[if gte mso 10]><style> /* Style Definitions */ table.MsoNormalTable {mso-style-name:"Table Normal"; mso-tstyle-rowband-size:0; mso-tstyle-colband-size:0; mso-style-noshow:yes; mso-style-priority:99; mso-style-parent:""; mso-padding-alt:0in 5.4pt 0in 5.4pt; mso-para-margin-top:0in; mso-para-margin-right:0in; mso-para-margin-bottom:8.0pt; mso-para-margin-left:0in; line-height:107%; mso-pagination:widow-orphan; font-size:11.0pt; font-family:"Calibri",sans-serif; mso-ascii-font-family:Calibri; mso-ascii-theme-font:minor-latin; mso-hansi-font-family:Calibri; mso-hansi-theme-font:minor-latin; mso-bidi-font-family:"Times New Roman"; mso-bidi-theme-font:minor-bidi;} </style><![endif]--> <br /><div style="margin-bottom: .0001pt; margin: 0in;"><span style="color: black; font-family: &quot;arial&quot; , sans-serif; font-size: 10.0pt;">I selected the available Connection Type - Essbase 9+</span>&nbsp;</div><div style="margin-bottom: .0001pt; margin: 0in;"></div><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a 
href="https://3.bp.blogspot.com/-a0TldBeErTM/WZUZYEprMWI/AAAAAAAAK9c/RkQlVNnlcwwBuBdwbgXzW75TtCL_6uV-ACEwYBhgL/s1600/09.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="1044" data-original-width="1600" height="260" src="https://3.bp.blogspot.com/-a0TldBeErTM/WZUZYEprMWI/AAAAAAAAK9c/RkQlVNnlcwwBuBdwbgXzW75TtCL_6uV-ACEwYBhgL/s400/09.png" width="400" />&nbsp;</span></a></span></span></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;">I set my connection to the public IP of my Essbase Cloud instance. </span></span></span></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div class="separator" style="clear: both; text-align: center;"><a href="https://2.bp.blogspot.com/-E3DTykBPTS4/WZcBc1stLiI/AAAAAAAAK-A/RWu-FtWCNOox92q7pJGe2nOQxffYPqqXQCLcBGAs/s1600/10.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1048" data-original-width="1600" height="261" src="https://2.bp.blogspot.com/-E3DTykBPTS4/WZcBc1stLiI/AAAAAAAAK-A/RWu-FtWCNOox92q7pJGe2nOQxffYPqqXQCLcBGAs/s400/10.png" width="400" /></a></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"></span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span
style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">Once again navigated to my cube and imported it</span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://1.bp.blogspot.com/-T8P02aJoweg/WZUZYTNPMOI/AAAAAAAAK9c/Wlo7rASPo0s0yzKYxvi9jAkxhP4H8bHrwCEwYBhgL/s1600/11.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="1048" data-original-width="1600" height="261" src="https://1.bp.blogspot.com/-T8P02aJoweg/WZUZYTNPMOI/AAAAAAAAK9c/Wlo7rASPo0s0yzKYxvi9jAkxhP4H8bHrwCEwYBhgL/s400/11.png" width="400" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"></span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"></span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">This time it worked</span></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , 
sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"></span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://3.bp.blogspot.com/-ec62ioI7KXo/WZUZYaQw5RI/AAAAAAAAK9c/Aqn66Mc2uaULx5IZCA5ORoSixaLndC1YwCEwYBhgL/s1600/12.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="1048" data-original-width="1600" height="261" src="https://3.bp.blogspot.com/-ec62ioI7KXo/WZUZYaQw5RI/AAAAAAAAK9c/Aqn66Mc2uaULx5IZCA5ORoSixaLndC1YwCEwYBhgL/s400/12.png" width="400" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">And my cube shows up fine in the Physical Layer.</span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a 
href="https://4.bp.blogspot.com/-dLasCUb6hhY/WZUZYiA9rJI/AAAAAAAAK9c/z1fVQf0nPYkZrOC4DjGopwwTwAeQ_R2MQCEwYBhgL/s1600/13.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="690" data-original-width="693" height="318" src="https://4.bp.blogspot.com/-dLasCUb6hhY/WZUZYiA9rJI/AAAAAAAAK9c/z1fVQf0nPYkZrOC4DjGopwwTwAeQ_R2MQCEwYBhgL/s320/13.png" width="320" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">After renaming the Database and Connection Pool and adjusting the Dimension properties to suit my needs, I followed the same process as for an on-prem cube in the RPD.</span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://4.bp.blogspot.com/-5WLQvNfSqMs/WZUZYubOXSI/AAAAAAAAK9c/m9H7XMHKEtYp-bWlqUa202crlOCyMeMzQCEwYBhgL/s1600/14.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="954" data-original-width="1502" height="253" src="https://4.bp.blogspot.com/-5WLQvNfSqMs/WZUZYubOXSI/AAAAAAAAK9c/m9H7XMHKEtYp-bWlqUa202crlOCyMeMzQCEwYBhgL/s400/14.png" width="400" /></span></a></span></span></div><span
style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">It is time to save all my work and like any good RPD Developer I click yes on the Global Consistency Checker.</span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://4.bp.blogspot.com/-dPgAJBubeK8/WZUZY0oE0II/AAAAAAAAK9c/aq8g8xc42TAZCQnVf3GedVBAXGWM5XRSwCEwYBhgL/s1600/15.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="954" data-original-width="1502" height="253" src="https://4.bp.blogspot.com/-dPgAJBubeK8/WZUZY0oE0II/AAAAAAAAK9c/aq8g8xc42TAZCQnVf3GedVBAXGWM5XRSwCEwYBhgL/s400/15.png" width="400" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">And what do we have here...? 
&nbsp;The warning indicates that my Database Type is not correct.</span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://2.bp.blogspot.com/-iQvQE_qioJw/WZUZY8HUVnI/AAAAAAAAK9c/2oj0UhNBQKUvwlu3BOYfYwuOqDoe2ZDPQCEwYBhgL/s1600/16.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="824" data-original-width="1600" height="205" src="https://2.bp.blogspot.com/-iQvQE_qioJw/WZUZY8HUVnI/AAAAAAAAK9c/2oj0UhNBQKUvwlu3BOYfYwuOqDoe2ZDPQCEwYBhgL/s400/16.png" width="400" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><span style="font-size: x-small;">When I</span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"> checked the Physical Layer properties of the database, I saw that I had a couple more options than I did when I selected Import Metadata. 
&nbsp;I selected the most recent release version available in the list.</span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a href="https://3.bp.blogspot.com/-SwKUjii4uss/WZUZZLHU5pI/AAAAAAAAK9c/yfn1zX0wtf0Sy22_6a19ktT4d05Xs2HyQCEwYBhgL/s1600/17.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="970" data-original-width="817" height="400" src="https://3.bp.blogspot.com/-SwKUjii4uss/WZUZZLHU5pI/AAAAAAAAK9c/yfn1zX0wtf0Sy22_6a19ktT4d05Xs2HyQCEwYBhgL/s400/17.png" width="336" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">This time when I saved and allowed the Global Consistency Check to run, I did not get any warnings or errors!</span></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-size: x-small;"><a
href="https://1.bp.blogspot.com/-HMd7JzeH9H8/WZUZZOyjOjI/AAAAAAAAK9c/QHGQIUbNcZMZIYzD3L0TkI0eXYjdRwD-wCEwYBhgL/s1600/18.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" data-original-height="954" data-original-width="1502" height="253" src="https://1.bp.blogspot.com/-HMd7JzeH9H8/WZUZZOyjOjI/AAAAAAAAK9c/QHGQIUbNcZMZIYzD3L0TkI0eXYjdRwD-wCEwYBhgL/s400/18.png" width="400" /></span></a></span></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;"><br /></span></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; font-size: x-small;">So it is time to submit an SR and deploy to the cloud! But that is a topic for another post....</span></span><br /><br /> Wayne D. Van Sluys tag:blogger.com,1999:blog-7768091516190336427.post-3314267337444036097 Fri Aug 18 2017 12:13:00 GMT-0400 (EDT) How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau http://www.rittmanmead.com/blog/2017/08/how-was-game-of-thrones-s07-e05-tweet-analysis-with-kafka-bigquery-and-tableau/ <img src="http://www.rittmanmead.com/blog/content/images/2017/08/Images-View-3.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"><p>I don't trust statistics and personally believe that at least 74% of them are wrong.... <strong>but</strong> I bet nearly 100% of people with any interest in fantasy (or just any) TV shows are watching the 7th series of <a href="http://www.hbo.com/game-of-thrones">Game of Thrones</a> (GoT) by HBO. <br> If you are one of those, join me in the analysis of the latest tweets regarding the subject. 
Please also be aware that, if you're not caught up on the latest episode, this article may reveal some spoilers. My suggestion, then, is to watch the episodes first and come back here for the analysis!</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Brace-Stark.jpg" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>If you aren't part of the above group then <code>¯\_(ツ)_/¯</code>. Still, this post contains a lot of details on how to perform analysis on any tweet with Tableau and BigQuery together with Kafka source and sink configurations. I'll leave it to you to find another topic to put all this into practice.</p> <h1 id="overallsetup">Overall Setup</h1> <p>As described in my previous post on <a href="https://www.rittmanmead.com/blog/2017/07/analyzing-wimbledon-twitter-feeds-in-real-time-with-kafka-presto-and-oracle-dvd-v3/">analysing Wimbledon tweets</a> I've used Kafka for the tweet extraction phase. In this case however, instead of querying the data directly in Kafka with <a href="https://prestodb.io">Presto</a>, I'm landing the data into a <a href="https://cloud.google.com/bigquery/">Google BigQuery</a> Table. The last step is optional, since in my last blog I was querying Kafka directly, but in my opinion it represents the perfect use case for each technology: Kafka for streaming and BigQuery for storing and querying data. <br> The endpoint is represented by Tableau, which has a native connector to BigQuery. The following image represents the complete flow.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Overall-Flow.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>One thing to notice: at this point in time I'm using an on-premises installation of Kafka which I kept from my previous blog. 
However, since both source and target are natively cloud applications, I could easily move Kafka to the cloud as well, using for example the recently announced <a href="https://www.confluent.io/confluent-cloud/">Confluent Kafka as a Service</a>. </p> <p>Now let's add some details about the overall setup.</p> <h2 id="kafka">Kafka</h2> <p>For the purpose of this blog post I've switched from the <a href="https://kafka.apache.org/downloads">original Apache Kafka</a> distribution to the <a href="https://www.confluent.io/download/">Confluent open source</a> one. I've chosen the Confluent distribution since it includes <a href="http://docs.confluent.io/current/connect/intro.html">Kafka Connect</a>, which is </p> <blockquote> <p>A framework for scalably and reliably streaming data between Apache Kafka and other data systems</p> </blockquote> <p>Using this framework anybody can write a connector to push data from any system (Source Connector) to Kafka or pull data from it (Sink Connector). <a href="https://www.confluent.io/product/connectors/">This</a> is a list of available connectors developed and maintained either by Confluent or by the community. Moreover, Kafka Connect provides the benefit of parsing the message body and storing it in Avro format, which makes it easier to access and faster to retrieve.</p> <h3 id="kafkasourcefortwitter">Kafka Source for Twitter</h3> <p>In order to source from Twitter I've been using <a href="https://github.com/jcustenborder/kafka-connect-twitter">this connector</a>. 
The setup is pretty easy: copy the source folder named <code>kafka-connect-twitter-master</code> under <code>$CONFLUENT_HOME/share/java</code> and modify the file <code>TwitterSourceConnector.properties</code> located under the <code>config</code> subfolder in order to include the connection details and the topics.</p> <p>The configuration file in my case looked like the following:</p> <pre><code>name=connector1 tasks.max=1 connector.class=com.github.jcustenborder.kafka.connect.twitter.TwitterSourceConnector # Set these required values twitter.oauth.accessTokenSecret=&lt;TWITTER_TOKEN_SECRET&gt; process.deletes=false filter.keywords=#got,gameofthrones,stark,lannister,targaryen kafka.status.topic=rm.got kafka.delete.topic=rm.got twitter.oauth.consumerSecret=&lt;TWITTER_CONSUMER_SECRET&gt; twitter.oauth.accessToken=&lt;TWITTER_ACCESS_TOKEN&gt; twitter.oauth.consumerKey=&lt;TWITTER_CONSUMER_KEY&gt; </code></pre> <p>A few things to notice:</p> <ul> <li><code>process.deletes=false</code>: I won't delete any messages from the stream</li> <li><code>kafka.status.topic=rm.got</code>: I'll write against a topic named <code>rm.got</code></li> <li><code>filter.keywords=#got,gameofthrones,stark,lannister,targaryen</code>: I'll take all the tweets that include at least one of these keywords. The list could be expanded; this was just a test case.</li> </ul> <p>All the work is done! 
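</p>

<p>As a quick sanity check of what the connector should capture, the keyword matching above can be sketched in plain Python (the matching logic is my illustration of the behaviour, not the connector's actual code):</p>

```python
# Illustrative sketch of filter.keywords: a tweet is kept if it contains
# at least one of the configured keywords (case-insensitive).
KEYWORDS = ["#got", "gameofthrones", "stark", "lannister", "targaryen"]

def matches_filter(text: str, keywords=KEYWORDS) -> bool:
    """Return True if the tweet text contains any of the keywords."""
    lowered = text.lower()
    return any(k in lowered for k in keywords)

tweets = [
    "Winter is here #GoT",
    "The Lannisters always pay their debts",
    "Completely unrelated tweet",
]
kept = [t for t in tweets if matches_filter(t)]
print(kept)  # the first two tweets survive the filter
```

<p>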
The next step is to start the Kafka Connect execution via the following call from <code>$CONFLUENT_HOME/share/java/kafka-connect-twitter</code></p> <pre><code>$CONFLUENT_HOME/bin/connect-standalone config/connect-avro-docker.properties config/TwitterSourceConnector.properties </code></pre> <p>I can see the flow of messages in Kafka using the <code>avro-console-consumer</code> command</p> <pre><code>./bin/kafka-avro-console-consumer --bootstrap-server localhost:9092 --property schema.registry.url=http://localhost:8081 --property print.key=true --topic twitter --from-beginning </code></pre> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/TweetFlow.gif" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>You can see (or maybe it's a little bit difficult from the GIF) that the message body was transformed from JSON to Avro format; the following is an example</p> <pre><code>{"CreatedAt":{"long":1502444851000}, "Id":{"long":895944759549640704}, "Text":{"string":"RT @haranatom: Daenerys Targaryen\uD83D\uDE0D https://t.co/EGQRvLEPIM"}, [...] ,"WithheldInCountries":[]} </code></pre> <h2 id="kafkasinktobigquery">Kafka Sink to BigQuery</h2> <p>Once the data is in Kafka, the next step is to push it to the selected datastore: BigQuery. I can rely on Kafka Connect also for this task, with the related code written and supported by the community and available on <a href="https://github.com/wepay/kafka-connect-bigquery">GitHub</a>.</p> <p>All I had to do was download the code and change the file <code>kcbq-connector/quickstart/properties/connector.properties</code></p> <pre><code>... topics=rm.got .. autoCreateTables=true autoUpdateSchemas=true ... 
# The name of the BigQuery project to write to project=&lt;NAME_OF_THE_BIGQUERY_PROJECT&gt; # The name of the BigQuery dataset to write to (leave the '.*=' at the beginning, enter your # dataset after it) datasets=.*=&lt;NAME_OF_THE_BIGQUERY_DATASET&gt; # The location of a BigQuery service account JSON key file keyfile=/home/oracle/Big-Query-Key.json </code></pre> <p>The changes included:</p> <ul> <li>the <strong>topic name</strong> to source from Kafka</li> <li>the <strong>project</strong>, <strong>dataset</strong> and <strong>Keyfile</strong> which are the connection parameters to BigQuery. Note that the Keyfile is automatically generated when creating a BigQuery service.</li> </ul> <p>After verifying the settings, as per Kafka Connect instructions, I had to create the tarball of the connector and extract its contents</p> <pre><code>cd /path/to/kafka-connect-bigquery/ ./gradlew clean confluentTarBall mkdir bin/jar/ &amp;&amp; tar -C bin/jar/ -xf bin/tar/kcbq-connector-*-confluent-dist.tar </code></pre> <p>The last step is to launch the connector by moving into the <code>kcbq-connector/quickstart/</code> subfolder and executing</p> <pre><code>./connector.sh </code></pre> <p>Note that you may need to specify the <code>CONFLUENT_DIR</code> if the Confluent installation home is not in a sibling directory</p> <pre><code>export CONFLUENT_DIR=/path/to/confluent </code></pre> <p>When everything starts up without any error, a table named <code>rm_got</code> (the name is automatically generated) appears in the BigQuery dataset I defined previously and starts populating.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/BigQuery.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>A side note: I encountered a <code>Java Heap Space</code> error during the run of the BigQuery sink. 
This was resolved by increasing the heap space setting of the connector via the following call</p> <pre><code>export KAFKA_HEAP_OPTS="-Xms512m -Xmx1g" </code></pre> <h1 id="bigquery">BigQuery</h1> <p>BigQuery, based on <a href="https://research.google.com/pubs/pub36632.html">Dremel's paper</a>, is Google's proposition for an enterprise cloud data warehouse which combines speed and scalability with separate pricing for storage and compute. If the cost of storage is common knowledge in the IT world, the compute cost is a fairly new concept. What this means is that the cost of the same query can vary depending on how the data is organized. In Oracle terms, we are used to associating the query cost with the one defined in the explain plan. In BigQuery that concept is translated from the "performance cost" to also the "financial cost" of a query: the more data a single query has to scan, the higher its cost. This makes the work of optimizing data structures visible not only performance-wise but also on the financial side. </p> <p>For the purpose of the blog post, I had almost no settings to configure other than creating a <a href="https://cloud.google.com">Google Cloud Platform</a> account, a BigQuery project and a dataset. </p> <p>During the Project creation phase, a <strong>Keyfile</strong> is generated and stored locally on the computer. This file contains all the credentials needed to connect to BigQuery from any external application; my suggestion is to store it in a secure place. 
</p> <pre><code>{ "type": "service_account", "project_id": "&lt;PROJECT_ID&gt;", "private_key_id": "&lt;PROJECT_KEY_ID&gt;", "private_key": "&lt;PRIVATE_KEY&gt;", "client_email": "&lt;E-MAIL&gt;", "client_id": "&lt;ID&gt;", "auth_uri": "https://accounts.google.com/o/oauth2/auth", "token_uri": "https://accounts.google.com/o/oauth2/token", "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs", "client_x509_cert_url": "&lt;URL&gt;" } </code></pre> <p>This file is used in the Kafka sink as we saw above.</p> <h1 id="tableau">Tableau</h1> <p>Once the data has landed in BigQuery, it's time to analyse it with Tableau! <br> The connection is really simple: from Tableau home I just need to select <code>Connect-&gt; To a Server -&gt; Google BigQuery</code>, fill in the connection details and select the project and datasource.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Tableau-BigQuery.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>An important feature to set is the <strong>Use Legacy SQL</strong> checkbox in the datasource definition. Without this setting checked I wasn't able to properly query the BigQuery datasource. This is due to the fact that "Standard SQL" doesn't support nested columns while Legacy SQL (also known as BigQuery SQL) does; for more info check the <a href="http://onlinehelp.tableau.com/current/pro/desktop/en-us/examples_googlebigquery.html#switch_to_bql">related Tableau page</a>.</p> <h1 id="analysingthedata">Analysing the data</h1> <p>Now the fun part starts: analysing the data! The integration between Tableau and BigQuery automatically exposes all the columns of the selected tables together with the correctly mapped datatypes, so I can immediately start playing with the dataset without having to worry about datatype conversions or date formats. 
I can simply include in the analysis the <code>CreatedAt</code> date and the <code>Number of Records</code> measure (named <code># of Tweets</code>) and display the number of tweets over time.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/--of-Tweets.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>Now I want to analyse where the tweets are coming from. I can use the <code>Place.Country</code> or the <code>Geolocation.Latitude</code> and <code>Geolocation.Longitude</code> fields in the tweet detail. Latitude and Longitude are more detailed while the Country is rolled up at state level, but both solutions have the same problem: they are available only for tweets with geolocation activated. </p> <p>After adding <code>Place.Country</code> and <code># of Tweets</code> to the canvas, I can then select the map as visualization. Two columns <code>Latitude (generated)</code> and <code>Longitude (generated)</code> are created on the fly mapping the country locations, and the selected visualization is shown.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Map-1.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>However, as mentioned before, this map shows only a subset of the tweets since the majority of tweets (almost 99%) have no location.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Nulls-In-Location.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>The fields <code>User.Location</code> and <code>User.TimeZone</code> suffer from a different problem: they are either null, or the possible values don't come from a predefined list but are left to the creativity of the account owner, who can type any string. 
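</p>

<p>The "mostly null location" situation can be illustrated with a tiny Python sketch over hypothetical records (field names mirror the tweet structure, the data is made up):</p>

```python
# Hypothetical tweet records: country is None when geolocation is off.
tweets = [
    {"id": 1, "country": "GB"},
    {"id": 2, "country": None},
    {"id": 3, "country": None},
    {"id": 4, "country": "IT"},
]

located = [t for t in tweets if t["country"] is not None]
share_located = len(located) / len(tweets)
print(f"{share_located:.0%} of tweets carry a location")
```

<p>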
As you can see, it seems we have some tweets coming directly from <strong>Winterfell</strong>, <strong>Westeros</strong>, and, interestingly enough... <strong>Hogwarts</strong>!</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Location-and-TimeZone.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>Checking the most engaged accounts based on the <code>User.Name</code> field clearly shows that <strong>Daenerys</strong> and <strong>Jon Snow</strong> take the time to tweet between fighting Cercei and the Whitewalkers.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Authors.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>The field <code>User.Lang</code> can be used to identify the language of the User. However, when analysing the raw data, it can be noticed that there are language splits for regional language settings (note <code>en</code> vs <code>en-gb</code>). We can solve the problem by creating a new field <code>User.Lang.Clean</code> taking only the first part of the string with a formula like</p> <pre><code>IF FIND([User.Lang],'-') =0 THEN [User.Lang] ELSE LEFT([User.Lang],FIND([User.Lang],'-')-1) END </code></pre> <p>With the interesting result of Italian being the 4th most used language, overtaking Portuguese, and showing the high interest in the show in my home country.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Language-1.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <h2 id="characterandhouseanalysis">Character and House Analysis</h2> <p>Still with me? So far we've done some pretty basic analysis on top of pre-built fields or with little transformations... 
now it's time to go deep into the tweet's <code>Text</code> field and check what the people are talking about!</p> <p>The first thing I wanted to do was check mentions of the characters and related houses. The more a house is mentioned, the more relevant it should be, correct? <br> The first text analysis I wanted to perform was the <strong>Stark</strong> vs <strong>Targaryen</strong> mention war: showing how many tweets mentioned both, only one, or neither of the two main houses. I achieved it with the below <code>IF</code> statement</p> <pre><code>IF contains(upper([Text]), 'STARK') AND contains(upper([Text]),'TARGARYEN') THEN 'Both' ELSEIF contains(upper([Text]), 'STARK') THEN 'Stark' ELSEIF contains(upper([Text]), 'TARGARYEN') THEN 'Targaryen' ELSE 'None' END </code></pre> <p>With the results supporting the house <strong>Stark</strong></p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Mentions-by-Family.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>I can do the same at the single character level, counting the mentions in separate columns, like for <strong>Jon Snow</strong></p> <pre><code>IIF(contains(upper([Text]), 'JON') OR contains(upper([Text]),'SNOW'), 1,0) </code></pre> <p>Note the <code>OR</code> condition, since I want to count as mentions both the words <code>JON</code> and <code>SNOW</code> because each uniquely refers to the same character. Similarly I can create a column counting the mentions of <strong>Arya Stark</strong> with the following formula</p> <pre><code>IIF(contains(upper([Text]), 'ARYA'), 1,0) </code></pre> <p>Note in this case I'm filtering only on the name (<code>ARYA</code>) since <em>Stark</em> can be a reference to multiple characters (Sansa, Bran ...). 
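</p>

<p>The same classification and counting logic, transcribed into Python for readability (a sketch of the formulas above, not the Tableau engine):</p>

```python
def house_mention(text: str) -> str:
    """Mirror of the Tableau IF statement: Both / Stark / Targaryen / None."""
    t = text.upper()
    stark, targ = "STARK" in t, "TARGARYEN" in t
    if stark and targ:
        return "Both"
    if stark:
        return "Stark"
    if targ:
        return "Targaryen"
    return "None"

def mentions_jon(text: str) -> int:
    """Mirror of IIF(contains JON OR contains SNOW, 1, 0)."""
    t = text.upper()
    return 1 if ("JON" in t or "SNOW" in t) else 0

print(house_mention("Stark vs Targaryen, who wins?"))  # Both
print(mentions_jon("King in the North: Jon!"))         # 1
```

<p>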
I created several columns like the two above for some characters and displayed them in a histogram ordered by <code># of Mentions</code> in descending order.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/mentions-by-Character.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>As expected after looking at the Houses results above, Jon Snow is leading the user mentions by a big margin over the others, with Daenerys in second place.</p> <p>The methods mentioned above however have some big limitations:</p> <ul> <li>I need to create a different column for every character/house I want to analyse</li> <li>The formula complexity increases if I want to analyse more houses/characters at the same time</li> </ul> <p>My goal would be to have an Excel file, where I set the search Key (like <code>JON</code> and <code>SNOW</code>) together with the related character and house, and mash this data with the BigQuery table.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Table-of-References.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>The joining condition would be something like </p> <pre><code>CONTAINS([BigQuery].[Text], [Excel].[Key]) &gt;0 </code></pre> <p>Unfortunately Tableau allows only <code>=</code> operators in text joining conditions during data blending, making the above syntax impossible to implement. 
I now have three options:</p> <ul> <li><strong>Give Up</strong>: Never, while there is still hope!</li> <li><strong>Move the Excel into a BigQuery table</strong> and resolve the problem there by writing a view on top of the data: works but increases the complexity on the BigQuery side, plus most Tableau users will not have write access to the related datasources.</li> <li><strong>Find an alternative way of joining the data</strong>: If the <code>CONTAINS</code> join is not possible during the data-blending phase, I may use it a little bit later...</li> </ul> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Tyrion-Drink.jpg" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p><mark>Warning: the method mentioned below is not optimal performance-wise and should be used carefully since it causes data duplication if not handled properly.</mark></p> <p>Without the option of using <code>CONTAINS</code> I had to create a <strong>cartesian join</strong> during the data-blending phase. With a cartesian join every row in the BigQuery table is repeated for every row in the Excel table. I managed to create a cartesian join by simply putting a <code>1-1</code> condition in the data-blending section.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/1-1-Join.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>I can then apply a filter on the resulting dataset to keep only the BigQuery rows mentioning one (or more) <code>Key</code> from the Excel file with the following formula.</p> <pre><code>IIF(CONTAINS(UPPER([Text]),[Key]),[Id],NULL) </code></pre> <p>This formula returns the tweet <code>Id</code> where the Excel's <code>[Key]</code> field is contained in the <code>UPPER([Text])</code> coming from Twitter. 
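</p>

<p>What the cartesian blend plus filter achieves can be sketched in Python: cross every tweet with every lookup row, then keep the tweet Id only when the Key is contained in the text. The data below is made up for illustration; distinct counting (Tableau's COUNTD) is simulated with a set:</p>

```python
from itertools import product

tweets = [
    {"Id": 1, "Text": "JON SNOW KNOWS NOTHING"},
    {"Id": 2, "Text": "THE NIGHT KING MARCHES"},
]
lookup = [  # hypothetical Excel lookup rows: Key -> Name, Family
    {"Key": "JON",  "Name": "Jon Snow", "Family": "Stark"},
    {"Key": "SNOW", "Name": "Jon Snow", "Family": "Stark"},
]

# Cartesian join (the 1-1 blend), then the IIF(CONTAINS(...), Id, NULL) filter
mention_ids = {}
for tweet, row in product(tweets, lookup):
    if row["Key"] in tweet["Text"].upper():
        mention_ids.setdefault(row["Name"], set()).add(tweet["Id"])

# COUNTD: distinct tweet Ids per character, so a tweet matching both
# JON and SNOW is still counted once
counts = {name: len(ids) for name, ids in mention_ids.items()}
print(counts)  # {'Jon Snow': 1}
```

<p>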
Since there are multiple Keys assigned to the same character/house (see Jon Snow with both keywords <code>JON</code> and <code>SNOW</code>), the aggregation for this column is <strong>count distinct</strong>, which in Tableau is achieved with the <code>COUNTD</code> formula. <br> I can now simply drag the <code>Name</code> from the Excel file and the <code># of Mentions</code> column with the above formula and aggregation method as count distinct.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/--of-Mentions-by-Character-2.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>The beauty of this solution is that now, if I need to do the same graph by house, I don't need to create columns with new formulas, but simply remove the <code>Name</code> field and replace it with <code>Family</code> coming from the Excel file.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/--of-Mentions-by-Family-1.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>Also, if I forgot a character or family, I simply need to add the relevant rows to the Excel lookup file and reload it; nothing changes in the formulas.</p> <h2 id="sentimentanalysis">Sentiment Analysis</h2> <p>Another goal I had in mind when analysing GoT data was the <strong>sentiment analysis</strong> of tweets and the average sentiment associated with a character or house. Doing sentiment analysis in Tableau is not too hard, since we can reuse already existing packages coming from R. </p> <p>For the Tableau-R integration to work I had to install and execute the <code>RServe</code> package from a workstation where R was already installed and set the connection in Tableau. More details on this configuration can be found in the <a href="https://community.tableau.com/docs/DOC-9916">Tableau documentation</a>.</p> <p>Once Tableau is configured to call R functions, it's time to analyse the sentiment. 
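</p>

<p>Before looking at the Tableau formula, the core idea behind lexicon-based scoring (the family syuzhet's <code>nrc</code> method belongs to) can be sketched in Python. The word list here is invented for illustration; the real NRC lexicon is far larger:</p>

```python
# Toy dictionary-based sentiment scorer: sum per-word scores from a lookup
# table. Positive words add, negative words subtract, unknown words score 0.
TOY_LEXICON = {"love": 1, "great": 1, "finally": 1, "hate": -1, "kill": -1}

def toy_sentiment(text: str) -> int:
    """Integer sentiment score for a piece of text."""
    return sum(TOY_LEXICON.get(w, 0) for w in text.lower().split())

print(toy_sentiment("i love this great episode"))     # 2
print(toy_sentiment("hate that they kill everyone"))  # -2
```

<p>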
I used the <a href="https://cran.r-project.org/web/packages/syuzhet/syuzhet.pdf">Syuzhet</a> package (previously downloaded) for this purpose. The <code>Sentiment</code> calculation is done by the following formula:</p> <pre><code>SCRIPT_INT( "library(syuzhet); r&lt;-(get_sentiment(.arg1,method = 'nrc'))", ATTR([Text])) </code></pre> <p>Where</p> <ul> <li><code>SCRIPT_INT</code>: The method will return an integer score for each Tweet, with positive sentiments having positive scores and negative sentiments negative scores</li> <li><code>get_sentiment(.arg1,method = 'nrc')</code>: is the function used</li> <li><code>ATTR([Text])</code>: the input parameter of the function, which is the tweet text</li> </ul> <p>At this point I can see the score associated with every tweet, and since that R package uses dictionaries, I limited my analysis to tweets in the English language (filtering the column <code>User.Lang.Clean</code> mentioned above by <code>en</code>).</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Sentiment-By-Tweet.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>The next step is to average the sentiment by character. This seems an easy step, but the devil is in the details! Tableau takes the output of the <code>SCRIPT_INT</code> call to R as an aggregated metric, thus not giving any visual options to re-aggregate! Plus, the tweet <code>Text</code> field must be present in the layout for the sentiment to be calculated, otherwise the metric returns <code>NULL</code>.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Cade.jpg" alt="How was Game Of Thrones S07 E05? 
Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>Fortunately there are <a href="http://onlinehelp.tableau.com/current/pro/desktop/en-us/functions_functions_tablecalculation.html">functions</a>, and specifically <strong>window functions</strong> like <code>WINDOW_AVG</code>, allowing a post-aggregation based on a formula defining the start and end. The other cool fact is that window functions work per <strong>partition</strong> of the data, and the start and end of the window can be defined using the <code>FIRST()</code> and <code>LAST()</code> functions.</p> <p>We can now create an aggregated version of our <code>Sentiment</code> column with the following formula</p> <pre><code>WINDOW_AVG(FLOAT([Sentiment]), FIRST(), LAST()) </code></pre> <p>This column will be repeated with the same value for all rows within the same <em>"partition"</em>, in this case the character <code>Name</code>.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Sentiment-Agg.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>Be aware that this solution doesn't re-aggregate the data; we'll still see the data by single tweet <code>Text</code> and character <code>Name</code>. However, the metric is calculated as a total per character, so graphs can be displayed.</p> <p>I wanted to show a <strong>Scatter Plot</strong> based on the <code># of Mentions</code> and <code>Sentiment</code> of each character. With the window functions defined above it's as easy as dragging the fields into the proper place and selecting the scatter plot viz.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Default-View.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>The default view is not very informative since I can't really associate a character with its position in the chart until I hover over the related point. 
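</p>

<p>What <code>WINDOW_AVG</code> over the <code>Name</code> partition computes can be sketched in Python: every row keeps its identity but carries the average of its partition, without collapsing the rows (toy data):</p>

```python
from collections import defaultdict

# (character name, tweet sentiment) rows, as they appear in the view
rows = [("Jon Snow", 2), ("Jon Snow", 0), ("Sansa", -1), ("Sansa", 0)]

# Average per partition (the character Name)
scores = defaultdict(list)
for name, score in rows:
    scores[name].append(score)
averages = {name: sum(s) / len(s) for name, s in scores.items()}

# Each row is repeated with its partition-level average, mirroring how
# WINDOW_AVG shows the same value on every row of a partition.
windowed = [(name, score, averages[name]) for name, score in rows]
print(windowed)
```

<p>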
Fortunately Tableau allows the definition of <a href="https://www.tableau.com/drive/custom-shapes">custom shapes</a> and I could easily assign character photos to related names.</p> <p><img width="600px" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau" src="http://www.rittmanmead.com/blog/content/images/2017/08/Images-View-2.png"></p> <p>If negative mentions for <strong>Littlefinger</strong> and <strong>Cercei</strong> were somehow expected, the characters with the most negative sentiment are <strong>Sansa Stark</strong>, probably due to the mysterious letter found by Arya in Baelish's room, and <strong>Ellaria Sand</strong>. On the opposite side we strangely see the <strong>Night King</strong> and, more generally, the <strong>WhiteWalkers</strong> with a very positive sentiment associated with them. Strange, this needs further investigation.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Night-king.jpg" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <h2 id="deepdiveonwhitewalkersandsansa">Deep Dive on Whitewalkers and Sansa</h2> <p>I can create a view per Character with associated tweets and sentiment scores and filter it for the <strong>WhiteWalkers</strong>. Looks like there are great expectations for this character in the next episodes (the battle is coming), which are associated with positive sentiments.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Night-King-Tweets.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>When analysing the detail of the number of tweets falling in each sentiment score category, it's clear why <strong>Sansa</strong> and the <strong>Whitewalkers</strong> have such a different sentiment average. 
Both appear as normal distributions, but the center of the Whitewalkers curve is around 1 (positive sentiment) while Sansa's is between -1 and 0 (negative sentiment).</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Analysis-of-Tweets-by-Sentiment-1.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>This explanation however doesn't give me enough information, and I want to understand more about the <strong>most used words</strong> in tweets mentioning WhiteWalkers or the Night King. </p> <p><mark>Warning: the method mentioned below is not optimal performance-wise and should be used carefully since it causes data duplication if not handled properly.</mark></p> <p>There is no easy way to do so directly in Tableau, even using R, since all the functions expect the output size to be 1-1 with the input, like sentiment score and text. <br> For this purpose I created a view on top of the BigQuery table directly in Tableau using the <strong>New Custom SQL</strong> option. The SQL used is the following</p> <pre><code>SELECT ID, REPLACE(REPLACE(SPLIT(UPPER(TEXT),' '),'#',''),'@','') word FROM [Dataset.rm_got] </code></pre> <p>The <code>SPLIT</code> function divides the <code>Text</code> field into multiple rows, one for every word separated by a space. This is a very basic split and can of course be enhanced if needed. On top of it the SQL removes references to <code>#</code> and <code>@</code>. Since the view contains the tweet's <code>Id</code> field, this can be used to join this dataset with the main table.</p> <p>The graph showing the overall words belonging to characters is not really helpful since the amount of words (even if I included only the ones with more than e chars) is too large to be analysed properly.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Overall-Words.png" alt="How was Game Of Thrones S07 E05? 
Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>When analysing the single words in particular tweets, I can clearly see that the <strong>Whitewalkers</strong> sentiment is driven by words like <code>King</code>, <code>Iron</code>, <code>Throne</code> having a positive sentiment. On the other hand, <strong>Sansa Stark</strong> is penalized by words like <code>Kill</code> and <code>Fight</code>, probably due to the possible troubles with Arya.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Word-Sentiment-Analysis.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>One thing to mention is that the word <a href="http://dictionary.cambridge.org/dictionary/english/stark">Stark</a> is classified with a negative sentiment due to the general English dictionary used for the scoring. This affects all the tweets and in particular the average scores of all the characters belonging to <strong>House Stark</strong>. A new "GoT" dictionary should be created and used in order to avoid those kinds of misinterpretations. </p> <p>Also, when talking about "Game of Thrones", words like <code>Kill</code> or <code>Death</code> can have a positive or negative meaning depending on the sentence; an imaginary tweet like </p> <blockquote> <p>Finally Arya kills Cercei</p> </blockquote> <p>should have a positive sentiment for <strong>Arya</strong> and a negative one for <strong>Cercei</strong>, but this is where automatic techniques of sentiment classification show their limits. Not even a new dictionary could help in this case.</p> <p>The chart below shows the percentage of words classified as positive (score 1 or 2) or negative (score -1 or -2) for the two selected characters. 
We can clearly see that <strong>Sansa</strong> has more negative words than positive, as expected, while <strong>Whitewalkers</strong> is on the opposite side.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Balance-Positive-Negative-2.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>Furthermore, the overall sentiment for the two characters may be explained by the following graph. This shows, for every sentence sentiment category (divided into bins <code>Positive</code>, <code>Neutral</code>, <code>Negative</code>), a histogram based on the count of words by single word sentiment. We can clearly see how words with positive sentiment are driving the <code>Positive</code> sentence category (and vice versa).</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Positive-sentence-positive-Words-1.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>Finally, the last graph shows the words that have most impacted the overall positive and negative sentiment for both characters.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/08/Top-Words.png" alt="How was Game Of Thrones S07 E05? Tweet Analysis with Kafka, BigQuery and Tableau"></p> <p>We can clearly see that <strong>Sansa</strong>'s negative sentiment is due to <code>Stark</code>, <code>Hate</code> and <code>Victim</code>. On the other side, the <strong>Whitewalkers</strong>' positive sentiment is due to words like <code>King</code> (Night King is the character) and <code>Finally</code>, probably due to the battle coming in the next episode. As you can see there are also multiple instances of the <code>King</code> word due to different punctuation preceding or following the word. I stated above that the BigQuery SQL extracting the words via the <code>SPLIT</code> function was very basic; we can now see why. 
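</p>

<p>A slightly richer split than the plain SQL one can be sketched in Python, stripping punctuation as well as <code>#</code> and <code>@</code> so that <code>King</code> and <code>King,</code> aggregate together (my illustration of the enhancement, not the SQL actually used):</p>

```python
import re

def words(text: str):
    """Uppercase words with #, @ and surrounding punctuation stripped."""
    # Keep only runs of letters (and apostrophes); punctuation, hashes
    # and mentions fall away instead of sticking to the words.
    return re.findall(r"[A-Z']+", text.upper())

print(words("Finally the Night King, THE KING! #GoT"))
# both 'KING,' and 'KING!' now normalize to the same token 'KING'
```

<p>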
Small enhancements to the function would aggregate the words properly.</p> <p>Are you still there? Do you wonder what's left? Well, there is a whole set of analysis that can be done on top of this dataset, including checking how the sentiment behaved over time during the live event or comparing this week's dataset with the next episode's one. The latter may happen next week so... <a href="mailto:info+ftGot@rittmanmead.com">Keep in touch</a>!</p> <p>Hope you enjoyed the analysis... otherwise... Dracarys!</p> <p><center><iframe src="https://giphy.com/embed/fsHMfeK2YWfeM" width="480" height="266" frameborder="0" class="giphy-embed" allowfullscreen></iframe><p><a href="https://giphy.com/gifs/got-spoiler-dracarys-fsHMfeK2YWfeM">via GIPHY</a></p></center></p> Francesco Tisiot a00cb72b-7498-4226-840f-66db47335d98 Thu Aug 17 2017 11:54:42 GMT-0400 (EDT) Fun with Oracle Analytics Cloud (OAC): Creating Essbase Cubes http://blog.performancearchitects.com/wp/2017/08/16/fun-with-oracle-analytics-cloud-oac-creating-essbase-cubes/ <p>Author: Andrew Tauro, Performance Architects</p> <p>Just like there are multiple ways to skin a cat, there’s more than one way to create an Essbase cube in <a href="https://cloud.oracle.com/en_US/oac">Oracle Analytics Cloud (OAC)</a>. While the best way to migrate on-premise <a href="https://docs.oracle.com/cloud/latest/analytics-cloud/ESSUG/overview.htm#ESSUG-getting_started_1">Essbase cubes</a> to OAC is to use the standalone “EssbaseLCMUtility” tool, to create cubes from scratch there are three ways that I have used so far: using the Web UI; building an Application Workbook by hand (or from a template); or using the Cube Designer. The latter two are the focus of this blog.</p> <p>The Application Workbook is essentially a Microsoft Excel workbook that contains a predefined set of tabs, with the contents arranged in a predetermined manner. 
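As a rough outline of that tab layout (tab names as described in this post; the purpose notes are my own summary, not official documentation):

```python
# Sketch of an Application Workbook's tab layout. Only the first two
# tabs appear to be required when creating an application.
workbook_tabs = {
    "Essbase.Cube":     "application/database names and the dimension list (required)",
    "Cube.Settings":    "BSO vs. ASO, duplicate member names, other database properties (required)",
    "Cube.Generations": "generation properties of the database",
    "Dim.<Name>":       "dimension build records",
    "Data.<Name>":      "data records to load",
    "Calc.<Name>":      "calculation scripts",
}

for tab, purpose in workbook_tabs.items():
    print(f"{tab}: {purpose}")
```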
What that means is the workbook has a bunch of tabs like this:</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at1.png"><img class="alignnone size-medium wp-image-2115" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at1-300x21.png" alt="" width="300" height="21" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at1-300x21.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at1-768x54.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at1-1024x73.png 1024w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at1-624x44.png 624w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Each of these tabs serves a particular purpose, but from what I can tell only the first two are a “must” when creating the application:</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at2.png"><img class="alignnone size-medium wp-image-2114" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at2-300x136.png" alt="" width="300" height="136" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at2-300x136.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at2-768x349.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at2-1024x466.png 1024w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at2-624x284.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at2.png 1163w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>The “Essbase.Cube” worksheet defines the application and database names, which are required information when creating a cube. 
In addition, this sheet is used to define the cube dimensions:</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at3.png"><img class="alignnone size-medium wp-image-2113" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at3-300x161.png" alt="" width="300" height="161" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at3-300x161.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at3-768x413.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at3-624x335.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at3.png 994w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>“Cube.Settings” and “Cube.Generations” define properties of the Essbase database. The former defines some crucial cube information, such as whether it is going to be a <a href="https://en.wikipedia.org/wiki/Essbase">block storage option (BSO) or aggregate storage option (ASO)</a> cube, and if it will allow for duplicate member names.</p> <p>The remaining tabs populate the dimensions (“Dim” tabs), data (“Data” tabs) and/or define calculation scripts (“Calc” tabs) for the cube. 
If you are familiar with building Essbase dimensions or data files and/or writing calc scripts, these will look very familiar.</p> <p>For those of you who are not familiar with these items, there is the option of using the Cube Designer.</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at4.png"><img class="alignnone size-medium wp-image-2112" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at4-300x76.png" alt="" width="300" height="76" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at4-300x76.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at4-768x195.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at4-1024x260.png 1024w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at4-624x158.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at4.png 1203w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>This is an add-in for Microsoft Excel that you can download via Smart View from your OAC instance.</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at5.png"><img class="alignnone size-medium wp-image-2111" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at5-160x300.png" alt="" width="160" height="300" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at5-160x300.png 160w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at5.png 437w" sizes="(max-width: 160px) 100vw, 160px" /></a> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at6.png"><img class="alignnone size-medium wp-image-2110" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at6-155x300.png" alt="" width="155" height="300" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at6-155x300.png 155w, 
http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at6.png 435w" sizes="(max-width: 155px) 100vw, 155px" /></a></p> <p>The “Cube Designer” menu item provides tabbed screens for creating the application workbook. Walking through the tabs allows you to set up the application workbook, and the “To Sheet” and “From Sheet” options facilitate reading from, and pushing to, the active workbook:</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at7.png"><img class="alignnone size-medium wp-image-2109" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at7-300x143.png" alt="" width="300" height="143" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at7-300x143.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at7-768x367.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at7-1024x490.png 1024w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at7-624x298.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/08/at7.png 1301w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Once complete, the cube can be created via the web user interface as an import.</p> <p>This has greatly reduced the complexity of creating Essbase cubes, and is just one of the ways that OAC is redefining the way we perform analytics using Essbase.</p> <p>As we explore the capabilities of OAC, we will continue to share our thoughts with you, so stay tuned. 
While you take this journey with us, if you have any questions on this, feel free to send us a note at <a href="mailto:communications@performancearchitects.com">communications@performancearchitects.com</a> and we will be in touch.</p> Melanie Mathews http://blog.performancearchitects.com/wp/?p=2108 Wed Aug 16 2017 05:20:55 GMT-0400 (EDT) #datavault 2.0, Hashes, one more time http://danlinstedt.com/allposts/datavaultcat/datavault-2-0-hashes-one-more-time/ A fact based look at Hashing and Sequences and Collision Strategies in Data Vault 2.0 Standards. Dan Linstedt http://danlinstedt.com/?p=2864 Tue Aug 15 2017 12:52:25 GMT-0400 (EDT) #datavault 2.0 and Hash Keys, Again? http://danlinstedt.com/allposts/datavaultcat/datavault-2-0-and-hash-keys-again/ Yet another dive in to Hash Keys and Data Vault 2.0 Dan Linstedt http://danlinstedt.com/?p=2861 Mon Aug 14 2017 15:07:12 GMT-0400 (EDT) FDMEE Custom Reports: Query Definition http://beyond-just-data.blogspot.com/2017/08/fdmee-custom-reports-query-definition.html <a href="https://3.bp.blogspot.com/-XHYgA5rigSk/WD5qSOQJczI/AAAAAAAAKgk/KAyyUnQ8WbQrUQKqmqaPtutuPjlGyJxHwCLcB/s1600/1-copy.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span></a><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;">When creating a new custom report, one first needs data, and the Query Definition is the starting point in the process.&nbsp; As I wrote in the first post of this series, the report engine for FDMEE Reports is Oracle BI Publisher.&nbsp; The query engine within BI Publisher generates an XML file, and that XML file is combined with the Layout Template and Translation Template to produce the report output.</span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><br /><div class="separator" style="clear: both; text-align: 
center;"><span style="clear: left; float: left; font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="118" src="https://4.bp.blogspot.com/-JyoVhxsCdss/WKHc5FnUGTI/AAAAAAAAK2w/9Kvy656yGdI9HyxZguibSqqm3SlM1YoKACLcB/s400/query.png" width="400" /></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span><br /></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;">As with most of the ERP/EPM systems on the market the database supporting the </span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;">application can be quite intimidating and the vast number of tables confusing.&nbsp; Fortunately most custom reports are derivatives of existing reports and therefore using the&nbsp;existing&nbsp;Query Definitions as a starting point can be beneficial in learning the tables used and their purpose.</span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;">For this exercise I am interested in a report to list Locations and the Data Load Rules associated with that Location.&nbsp;</span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;">In the Query Definition section there is a Query that looks similar to what I need. &nbsp;I will copy the SQL from the 3 text boxes. 
&nbsp;The three different boxes allow for the use of inline parameters from prompts in the WHERE clause.</span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><br /><div class="separator" style="clear: both; text-align: center;"><span style="clear: left; float: left; font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; margin-bottom: 1em; margin-right: 1em;"><img border="0" height="230" src="https://3.bp.blogspot.com/-XHYgA5rigSk/WD5qSOQJczI/AAAAAAAAKgk/KAyyUnQ8WbQrUQKqmqaPtutuPjlGyJxHwCLcB/s400/1-copy.png" width="400" /></span><span style="clear: left; float: left; font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; margin-bottom: 1em; margin-right: 1em;"><br /></span></div><div style="text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , 
&quot;helvetica&quot; , sans-serif;"><br /></span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;">Since my FDMEE Repository is on Oracle Database I will paste the SQL into SQL Developer. &nbsp;Using [Ctrl] [F7] keys I can view the SQL in a nicely formatted layout.</span><br /><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span></div><div class="separator" style="clear: both; text-align: left;"><a href="https://2.bp.blogspot.com/-casgbzesQlU/WD5qRzrbPzI/AAAAAAAAKgg/JKL2-MUtXsABuHH667Rc--qO1E3Fgz9sQCEw/s1600/2-FormattedSQL.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em; text-align: center;"><img border="0" height="400" src="https://2.bp.blogspot.com/-casgbzesQlU/WD5qRzrbPzI/AAAAAAAAKgg/JKL2-MUtXsABuHH667Rc--qO1E3Fgz9sQCEw/s400/2-FormattedSQL.png" width="343" /></a></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;">Since I like to work in a Graphical view of my queries I will switch to the Query Builder view. 
This allows me to see the tables and joins more easily.</span></div><div style="text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif; margin-left: 1em; margin-right: 1em; text-align: center;"></span></div><div class="separator" style="clear: both; text-align: center;"><br /></div><div class="separator" style="clear: both; text-align: left;"><a href="https://2.bp.blogspot.com/-3bgshwjaDCY/WD5qRyMSCNI/AAAAAAAAKgc/Qp1jX3TPrAwv5wibtydzF57n2Q0n38x-QCEw/s1600/3-QueryBuilder.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" height="343" src="https://2.bp.blogspot.com/-3bgshwjaDCY/WD5qRyMSCNI/AAAAAAAAKgc/Qp1jX3TPrAwv5wibtydzF57n2Q0n38x-QCEw/s400/3-QueryBuilder.png" width="400" /></span></a></div><div class="separator" style="clear: both; text-align: center;"><a href="https://2.bp.blogspot.com/-3bgshwjaDCY/WD5qRyMSCNI/AAAAAAAAKgc/Qp1jX3TPrAwv5wibtydzF57n2Q0n38x-QCEw/s1600/3-QueryBuilder.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span></a></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span><br /></span><br /><div style="margin-bottom: .0001pt; margin: 0in;"><span style="font-family: &quot;arial&quot; , sans-serif;">The following image shows some of the more frequently used tables within the FDMEE database/schema and what information they contain.</span></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span><br /></span></div><div class="separator" style="clear: both; text-align: left;"><a 
href="https://4.bp.blogspot.com/-ocH2AGw2PSs/WKHbWzJmikI/AAAAAAAAK2k/z3JRbReQj2EXaQQq6CTqqaU4x61u_4K6wCLcB/s1600/Tabes.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="174" src="https://4.bp.blogspot.com/-ocH2AGw2PSs/WKHbWzJmikI/AAAAAAAAK2k/z3JRbReQj2EXaQQq6CTqqaU4x61u_4K6wCLcB/s400/Tabes.png" width="400" /></a></div><div class="separator" style="clear: both; text-align: left;"><br /></div><div style="margin-bottom: .0001pt; margin: 0in;"><span style="font-family: &quot;arial&quot; , sans-serif;">Now I have a better understanding of the tables.<o:p></o:p></span></div><div style="margin-bottom: .0001pt; margin: 0in;"><br /></div><div style="margin-bottom: .0001pt; margin: 0in;"><span style="font-family: &quot;arial&quot; , sans-serif;">The report I want to produce lists Data Load Rules per Location, and since Data Load Rules are tied to a Category, I would like to know that as well.<o:p></o:p></span></div><div style="margin-bottom: .0001pt; margin: 0in;"><br /></div><div class="separator" style="clear: both;"></div><div style="margin-bottom: .0001pt; margin: 0in;"><span style="font-family: &quot;arial&quot; , sans-serif;">Since my report only needs 3 columns, I will create a new query in SQL Developer and drag the AIF_BALANCE_RULES, TPOVPARTITION and TPOVCATEGORY tables into the Query Builder and join them as shown below.&nbsp; Then I will select the columns I am interested in from the tables, and finally I will set the sort order.&nbsp; One tip with reports: it is faster to sort the data in the query than within the report layout.<o:p></o:p></span></div><div style="margin-bottom: .0001pt; margin: 0in;"><span style="font-family: &quot;arial&quot; , sans-serif;"><br /></span></div><div class="separator" style="clear: both; text-align: left;"><a href="https://2.bp.blogspot.com/-zVZ1UkruW0c/WD5qSCTAuDI/AAAAAAAAKgo/_U-TYWwZmT4_4KnWx0KdqsDdboVMiuI4gCEw/s1600/4-QueryBuilder.png" imageanchor="1" style="margin-left: 1em; 
margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" height="293" src="https://2.bp.blogspot.com/-zVZ1UkruW0c/WD5qSCTAuDI/AAAAAAAAKgo/_U-TYWwZmT4_4KnWx0KdqsDdboVMiuI4gCEw/s400/4-QueryBuilder.png" width="400" /></span></a></div><div class="separator" style="clear: both; text-align: center;"><a href="https://3.bp.blogspot.com/-XHYgA5rigSk/WD5qSOQJczI/AAAAAAAAKgk/KAyyUnQ8WbQrUQKqmqaPtutuPjlGyJxHwCLcB/s1600/1-copy.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span></a></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span></span><br /><div style="margin-bottom: .0001pt; margin: 0in;"><span style="font-family: &quot;arial&quot; , sans-serif;">I will need to switch back to the Worksheet view in order to collect the SQL for my query.<o:p></o:p></span><br /><span style="font-family: &quot;arial&quot; , sans-serif;"><br /></span></div><div class="separator" style="clear: both; text-align: left;"><a href="https://3.bp.blogspot.com/-Kg4jiZ9LZP8/WD5qSJcSC6I/AAAAAAAAKgs/9XLkGi9_2QsNTR5qDatLV3wi13KvrzV4gCEw/s1600/5-SQL.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" height="262" src="https://3.bp.blogspot.com/-Kg4jiZ9LZP8/WD5qSJcSC6I/AAAAAAAAKgs/9XLkGi9_2QsNTR5qDatLV3wi13KvrzV4gCEw/s400/5-SQL.png" width="400" /></span></a></div><br /><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span 
style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span></span></div><div class="separator" style="margin-bottom: .0001pt; margin: 0in;"><span style="font-family: &quot;arial&quot; , sans-serif;">In FDMEE I will create a new query definition named Location Data Load Rules and paste into the 2 text areas; the SELECT statement up to the ORDER BY into the Select Clause and paste the ORDER BY statement into the Group By/Order By Clause.<o:p></o:p></span></div><div class="separator" style="margin-bottom: .0001pt; margin: 0in;"><span style="font-family: &quot;arial&quot; , sans-serif;"><br /></span></div><div class="separator" style="clear: both; text-align: left;"><a href="https://3.bp.blogspot.com/-ZR8hDmFoUXA/WD5qScBf2EI/AAAAAAAAKgw/0KwRBjfe6Zg__w_hD5QG6U-SEhMqbuOgACEw/s1600/6-Paste-NewQueryDef.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" height="157" src="https://3.bp.blogspot.com/-ZR8hDmFoUXA/WD5qScBf2EI/AAAAAAAAKgw/0KwRBjfe6Zg__w_hD5QG6U-SEhMqbuOgACEw/s400/6-Paste-NewQueryDef.png" width="400" /></span></a></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span></span><br /><div style="margin-bottom: .0001pt; margin: 0in;"><span style="font-family: &quot;arial&quot; , sans-serif;">Save the Definition and then click the Validate Query to make sure that FDMEE does not have any issues with the query.<o:p></o:p></span><br /><span style="font-family: &quot;arial&quot; , sans-serif;"><br /></span></div><div class="separator" style="clear: both; text-align: left;"><a 
href="https://1.bp.blogspot.com/-D3TywKRwZCw/WD5qSSm9_2I/AAAAAAAAKg0/WnsLCIgCCzUwJl311ei9xHDioDqCp8DUgCEw/s1600/7-Validate.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" height="170" src="https://1.bp.blogspot.com/-D3TywKRwZCw/WD5qSSm9_2I/AAAAAAAAKg0/WnsLCIgCCzUwJl311ei9xHDioDqCp8DUgCEw/s400/7-Validate.png" width="400" /></span></a></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span><br /></span><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span><br /></span><br /><div class="separator" style="margin-bottom: .0001pt; margin: 0in;"><span style="font-family: &quot;arial&quot; , sans-serif;">In order to work with the BI Publisher MS Word Template Builder I will need a sample data file. The Query Definition interface has a Generate XML button. Clicking this button allows me to create the sample XML data file. 
Typically it will return 25 rows of data.<o:p></o:p></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot;;"></span><br /></span><br /><div class="separator" style="clear: both; text-align: left;"><a href="https://4.bp.blogspot.com/-uGPSJ3_vZK8/WD5qSV_k3PI/AAAAAAAAKg4/8OL78t801v41Nm7ehzBREvAkWyA6n6vCQCEw/s1600/8-GenerateXML.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" height="206" src="https://4.bp.blogspot.com/-uGPSJ3_vZK8/WD5qSV_k3PI/AAAAAAAAKg4/8OL78t801v41Nm7ehzBREvAkWyA6n6vCQCEw/s400/8-GenerateXML.png" width="400" /></span></a></div><div class="separator" style="clear: both; text-align: left;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><br /></span></div><div style="margin-bottom: .0001pt; margin: 0in;"><span style="font-family: &quot;arial&quot; , sans-serif;">When I open the file I can see the data structure of the query. 
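The whole round trip can be imitated in a standalone sketch: build the three-table join described earlier and look at the per-row output that feeds the XML file. This uses an in-memory SQLite database in place of the Oracle repository, and the column names are illustrative guesses, not the actual FDMEE schema:

```python
import sqlite3

# Mock up the three FDMEE tables with guessed (not actual) column names.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE TPOVPARTITION     (PARTITIONKEY INTEGER, PARTNAME TEXT);
CREATE TABLE TPOVCATEGORY      (CATKEY INTEGER, CATNAME TEXT);
CREATE TABLE AIF_BALANCE_RULES (RULE_NAME TEXT, PARTITIONKEY INTEGER, CATKEY INTEGER);
INSERT INTO TPOVPARTITION VALUES (1, 'Location_A');
INSERT INTO TPOVCATEGORY  VALUES (10, 'Actual');
INSERT INTO AIF_BALANCE_RULES VALUES ('DLR_GL_Actual', 1, 10);
""")

# The three-column report query: Location, Category, Data Load Rule,
# sorted in the query itself (faster than sorting in the report layout).
rows = con.execute("""
    SELECT p.PARTNAME, c.CATNAME, r.RULE_NAME
    FROM AIF_BALANCE_RULES r
    JOIN TPOVPARTITION p ON p.PARTITIONKEY = r.PARTITIONKEY
    JOIN TPOVCATEGORY  c ON c.CATKEY = r.CATKEY
    ORDER BY p.PARTNAME, c.CATNAME, r.RULE_NAME
""").fetchall()
print(rows)  # [('Location_A', 'Actual', 'DLR_GL_Actual')]
```

Each tuple corresponds to one row element in the generated XML sample.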
In the case the result of&nbsp;my simple query there is the Location name, the Category name and the Data Load Rule name for each row of data.&nbsp;<o:p></o:p></span></div><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"></span><br /></span><br /><div class="separator" style="clear: both; text-align: left;"><a href="https://3.bp.blogspot.com/-ArV4mTqAeug/WJjyBXYYzOI/AAAAAAAAK14/Yxp1gZ7ujtkEqGrMD_gVVMF4eh-9OpkMQCLcB/s1600/xml_output.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><span style="font-family: &quot;arial&quot; , &quot;helvetica&quot; , sans-serif;"><img border="0" height="286" src="https://3.bp.blogspot.com/-ArV4mTqAeug/WJjyBXYYzOI/AAAAAAAAK14/Yxp1gZ7ujtkEqGrMD_gVVMF4eh-9OpkMQCLcB/s320/xml_output.png" width="320" /></span></a></div><br /><div style="margin-bottom: .0001pt; margin: 0in;"><span style="font-family: &quot;arial&quot; , sans-serif;">Now that have a sample data set I can create the layout for my report. But that is the topic for another post.</span><b><span style="font-family: &quot;arial&quot; , sans-serif; font-size: 10.0pt;"><o:p></o:p></span></b></div> Wayne D. Van Sluys tag:blogger.com,1999:blog-7768091516190336427.post-7642577877769691734 Fri Aug 11 2017 15:26:00 GMT-0400 (EDT) Projection Process Case Study for Oracle Planning and Budgeting Cloud Service (PBCS) http://blog.performancearchitects.com/wp/2017/08/09/projection-process-case-study-for-oracle-planning-and-budgeting-cloud-service-pbcs/ <p>Author: Mike McLean, Performance Architects</p> <p>One of the requirements during a recent Performance Architects <a href="http://www.oracle.com/us/solutions/cloud/planning-budgeting/overview/index.html">Oracle Planning and Budgeting Cloud Service (PBCS)</a> implementation project was to create a solution for their projection process.   
During discussions with the Budget Office, we learned that requirements included:</p> <ul> <li>Projections occur three times a year: <ul> <li>After Q1 actuals are complete</li> <li>After Q2 actuals are complete</li> <li>After Q3 actuals are complete</li> </ul> </li> <li>Historical actuals and budget data must be used to seed the projection scenario</li> <li>Revenues need to be seeded using one methodology, while expenses are seeded using another methodology</li> <li>After the projection scenario is seeded, additional adjustments may need to be made by department, fund, program, etc.</li> </ul> <p>The calculation to seed the “Projection” scenario contains several components:</p> <ol> <li>Q1 revenue and expense actuals were copied to “Q1 Projection&#8221;:<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-1.png"><img class="alignnone size-medium wp-image-2102" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-1-300x191.png" alt="" width="300" height="191" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-1-300x191.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-1.png 404w" sizes="(max-width: 300px) 100vw, 300px" /></a></li> </ol> <ol start="2"> <li>Using prior year actuals, “% of YearTotal” was calculated for all revenue accounts for each month. 
See the example below for October (500 / 9,750 = 5.1%):<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-2.png"><img class="alignnone size-medium wp-image-2101" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-2-300x32.png" alt="" width="300" height="32" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-2-300x32.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-2-768x82.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-2-1024x110.png 1024w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-2-624x67.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-2.png 1130w" sizes="(max-width: 300px) 100vw, 300px" /></a></li> </ol> <ol start="3"> <li>The monthly “% of YearTotal” is then multiplied by the “Budget YearTotal” value to calculate each month’s revenue. 
See the example below for October (5.1% x 8,100 = 415):<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-3.png"><img class="alignnone size-medium wp-image-2100" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-3-300x76.png" alt="" width="300" height="76" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-3-300x76.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-3-768x195.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-3-1024x261.png 1024w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-3-624x159.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-3.png 1132w" sizes="(max-width: 300px) 100vw, 300px" /></a></li> </ol> <ol start="4"> <li>Expense accounts are calculated by taking the average of “Q1 Actuals” and loading that value into all of the out months:<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-4.png"><img class="alignnone size-medium wp-image-2099" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-4-300x28.png" alt="" width="300" height="28" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-4-300x28.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-4-768x71.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-4-1024x94.png 1024w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-4-624x57.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-4.png 1140w" sizes="(max-width: 300px) 100vw, 300px" /></a></li> <li>After the business rule is 
launched, the results are displayed:<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-5.png"><img class="alignnone size-medium wp-image-2098" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-5-300x41.png" alt="" width="300" height="41" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-5-300x41.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-5-768x105.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-5-1024x140.png 1024w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-5-624x85.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-5.png 1133w" sizes="(max-width: 300px) 100vw, 300px" /></a></li> </ol> <p>After the “Q1 Projection” is calculated, the Budget Office wanted the option to make additional adjustments.  This was accomplished by leveraging the hierarchy in the “Version” dimension.  
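<p>The arithmetic behind the steps above can be sketched in a few lines of Python. This is only a numeric illustration of the business rule's logic, using the hypothetical figures from the screenshots (October actuals of 500, a reference YearTotal of 9,750, and a Budget YearTotal of 8,100); the function names are made up for the example.</p>

```python
def project_revenue(monthly_actuals, budget_year_total):
    """Steps 2-3: spread the Budget YearTotal across the months using
    each month's share of the reference YearTotal."""
    year_total = sum(monthly_actuals.values())
    return {month: round(actual / year_total * budget_year_total)
            for month, actual in monthly_actuals.items()}

def project_expenses(q1_actuals, out_months):
    """Step 4: load the average of the Q1 actuals into every out month."""
    q1_average = sum(q1_actuals) / len(q1_actuals)
    return {month: round(q1_average) for month in out_months}

# October: 500 / 9,750 = 5.1% of the year, and 5.1% of 8,100 is 415
print(round(500 / 9_750 * 8_100))  # 415
```

<p>Because each month gets a fixed share of the same total, the monthly projections always sum back to the Budget YearTotal (up to rounding), which is the point of the "% of YearTotal" approach.</p>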
We created a sibling of “Projection Calculated” and named the member “Projection Adjustments.”  The parent of those two members is “Projection Total”.</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-6.png"><img class="alignnone size-medium wp-image-2097" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-6-300x130.png" alt="" width="300" height="130" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-6-300x130.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-6-768x333.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-6-1024x444.png 1024w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-6-624x270.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/Projection-Process-6.png 1034w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Need help with your PBCS implementation project? 
Contact us at <a href="mailto:sales@performancearchitects.com">sales@performancearchitects.com</a> and we can help you out!</p> Melanie Mathews http://blog.performancearchitects.com/wp/?p=2096 Wed Aug 09 2017 06:02:18 GMT-0400 (EDT) Kscope17 Conference Analytics Part 3: Analytics & JavaScript Visualization http://redpillanalytics.com/kscope17part3/ <p><img width="300" height="200" src="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?fit=300%2C200" class="attachment-medium size-medium wp-post-image" alt="" srcset="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?w=1400 1400w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?resize=300%2C200 300w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?resize=768%2C512 768w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?resize=1024%2C682 1024w" sizes="(max-width: 300px) 100vw, 300px" data-attachment-id="5213" data-permalink="http://redpillanalytics.com/protected-kscope17-conference-analytics/phone/" data-orig-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?fit=1400%2C933" data-orig-size="1400,933" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496681464&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;1000&quot;,&quot;shutter_speed&quot;:&quot;0.05&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="phone" data-image-description="" data-medium-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?fit=300%2C200" data-large-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?fit=1024%2C682" /></p><h3>Editor&#8217;s 
Note</h3> <p>This year, Red Pill Analytics was the Analytics Sponsor at ODTUG Kscope17. Our company motto is #challengeeverything – so we knew we wanted to do something different and unexpected while at the conference.<br /> What we eventually landed on was creating Analytics Stations using IoT technologies to show how an old school object, like a rotary phone, can be repurposed and turned into an interactive device.<br /> <a href="http://redpillanalytics.com/kscope17-analytics-hardware/">Part 1 focuses on hardware.</a><br /> <a href="http://redpillanalytics.com/protected-kscope17-conference-analytics/">Part 2 focuses on software.</a><br /> <a href="http://redpillanalytics.com/?p=5316&amp;preview=true">Part 3 focuses on Analytics &amp; JavaScript Visualization</a><br /> Kscope17 also used beacon technology to analyze conference attendee activities. Red Pill Analytics pulled that information through a REST API and told the story of Kscope17 using Oracle Data Visualization.</p> <hr /> <p><em>All of the relevant code for the project that is discussed below is open sourced via MIT license and can be found at: <a href="https://github.com/RedPillAnalytics/matrix-agent-phone" target="_blank" rel="noopener">https://github.com/RedPillAnalytics/matrix-agent-phone</a></em></p> <p>In the <a href="http://redpillanalytics.com/kscope17-analytics-hardware/">first post</a> we built the hardware for our IoT phone, and in the <a href="http://redpillanalytics.com/protected-kscope17-conference-analytics/">second post</a> we configured the Raspberry Pi to act as a kiosk and run an Electron framework desktop application. The idea was that Electron would be the primary codebase for our project: it would load, transform, and display our data, as well as handle hardware interaction with the IoT phone.</p> <p>The first thing to know about Electron is that it is essentially a website running without browser chrome, but with native OS hardware and filesystem access. 
It is a perfect candidate for rapid development with easy styling and powerful feature sets, because it allows the use of D3.js (a powerful JavaScript visualization framework, <a href="https://d3js.org/" target="_blank" rel="noopener">https://d3js.org/</a>) and other JavaScript visualization frameworks to design amazing displays of data, while also leveraging a large base of libraries for sourcing and transforming data.</p> <p>In our visualization we used D3.js and Chart.js as well as some custom HTML, CSS and JS for the circular tiles.</p> <p>Our data was gathered via the AWS JavaScript SDK (<a href="https://aws.amazon.com/sdk-for-node-js/" target="_blank" rel="noopener">https://aws.amazon.com/sdk-for-node-js/</a>) and specifically via Amazon Athena (<a href="https://aws.amazon.com/athena/" target="_blank" rel="noopener">https://aws.amazon.com/athena/</a>), which allows you to make SQL calls against an Amazon S3 JSON-based data lake.</p> Emily Carlsen http://redpillanalytics.com/?p=5316 Tue Aug 08 2017 15:55:12 GMT-0400 (EDT) Auto enabling APPROX_* function in the Oracle Database http://www.oralytics.com/2017/08/auto-enabling-approx-function-in-oracle.html <p>With the releases of 12.1 and 12.2 of Oracle Database we have seen some new functions that perform approximate calculations. These include:</p> <ul> <li>APPROX_COUNT_DISTINCT</li> <li>APPROX_COUNT_DISTINCT_DETAIL</li> <li>APPROX_COUNT_DISTINCT_AGG</li> <li>APPROX_MEDIAN</li> <li>APPROX_PERCENTILE</li> <li>APPROX_PERCENTILE_DETAIL</li> <li>APPROX_PERCENTILE_AGG</li></ul> <p>These functions can be used when approximate answers can be used instead of the exact answer. We can have many scenarios for these, and particularly as we move into the big data world, the ability to process our data quickly is sometimes more important than exact numbers. For example, is there really a difference between 40% of our customers being of type X versus 41%? 
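<p>To make the 40% versus 41% trade-off concrete, here is a small Python sketch of a KMV ("k minimum values") cardinality estimator. To be clear, this is not what Oracle itself runs — APPROX_COUNT_DISTINCT is reported to use a HyperLogLog-style sketch internally — but KMV is in the same family and shows why an approximate distinct count is both cheap and close to the truth.</p>

```python
import hashlib
import random

def kmv_estimate(values, k=1024):
    """Approximate a distinct count from the k smallest hash values.

    Hash each value to a point in [0, 2^64). If the k-th smallest
    point, normalized to [0, 1), is h_k, the distinct count is
    roughly (k - 1) / h_k.
    """
    max_hash = float(2 ** 64)
    # For brevity we hash the distinct values directly; a real sketch
    # streams the rows once and keeps only a bounded heap of k hashes.
    smallest = sorted(
        int.from_bytes(hashlib.sha1(repr(v).encode()).digest()[:8], "big")
        for v in set(values)
    )[:k]
    if len(smallest) < k:  # fewer than k distinct values: count is exact
        return len(smallest)
    return int((k - 1) * max_hash / smallest[-1])

random.seed(1)
rows = [random.randrange(20_000) for _ in range(200_000)]  # ~20,000 distinct ids
print(kmv_estimate(rows))  # typically lands within a few percent of ~20,000
```

<p>With k = 1024 the expected relative error is about 1/&radic;(k &minus; 2), roughly 3% — which is exactly the sense in which 40% and 41% can be interchangeable.</p>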
The real answer to this is, 'It Depends!', but for a lot of analytical and advanced analytical methods this difference doesn't really make a difference.</p> <p>There are various reports of performance improvements of anything from 6x to 50x in the response times of queries that use these functions instead of the more traditional functions.</p> <p>If you are a BI or big data analyst, you have probably built lots of code and queries using the more traditional functions. But what if you now want to use the newer functions? Does this mean you have to go and modify all the code you have written over the years? You can imagine getting approval to do this! </p> <p>The simple answer to this question is 'No'. No, you don't have to change any code; with some parameter changes for the DB or your session you can tell the database to automatically switch from using the traditional functions (count, etc.) to the newer, more optimised and significantly faster APPROX_* functions.</p> <p>So how can you do this magic?</p> <p>First let us see what the current settings values are:</p> <pre><br />SELECT name, value <br />FROM v$ses_optimizer_env <br />WHERE sid = sys_context('USERENV','SID') <br />AND name like '%approx%';<br /></pre> <p><img src="https://lh3.googleusercontent.com/-0V0Y6oD8d5M/WX8epvykGjI/AAAAAAAAMOs/k_mG0zxQBiwL2DHrfd7lWiTSAzwUDGJIQCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="190" height="100" /></p> <p>Now let us run a query to test what happens using the default settings (on a table I have with 10,500 records).</p> <pre><br />set autotrace on<br /><br />select count(distinct cust_id) from test_inmemory;<br /><br />COUNT(DISTINCTCUST_ID)<br />----------------------<br /> 1500<br /><br /><br />Execution Plan<br />----------------------------------------------------------<br />Plan hash value: 2131129625<br /><br />--------------------------------------------------------------------------------------<br />| Id | 
Operation | Name | Rows | Bytes | Cost (%CPU)| Time |<br />--------------------------------------------------------------------------------------<br />| 0 | SELECT STATEMENT | | 1 | 13 | 70 (2)| 00:00:01 |<br />| 1 | SORT AGGREGATE | | 1 | 13 | | |<br />| 2 | VIEW | VW_DAG_0 | 1500 | 19500 | 70 (2)| 00:00:01 |<br />| 3 | HASH GROUP BY | | 1500 | 7500 | 70 (2)| 00:00:01 |<br />| 4 | TABLE ACCESS FULL| TEST_INMEMORY | 10500 | 52500 | 69 (0)| 00:00:01 |<br />--------------------------------------------------------------------------------------<br /></pre> <p>Let us now set the automatic usage of the APPROX_* function.</p> <pre><br />alter session set approx_for_aggregation = TRUE;<br /><br />SQL> select count(distinct cust_id) from test_inmemory;<br /><br />COUNT(DISTINCTCUST_ID)<br />----------------------<br /> 1495<br /><br /><br />Execution Plan<br />----------------------------------------------------------<br />Plan hash value: 1029766195<br /><br />---------------------------------------------------------------------------------------<br />| Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |<br />---------------------------------------------------------------------------------------<br />| 0 | SELECT STATEMENT | | 1 | 5 | 69 (0)| 00:00:01 |<br />| 1 | <strong>SORT AGGREGATE APPROX</strong>| | 1 | 5 | | |<br />| 2 | TABLE ACCESS FULL | TEST_INMEMORY | 10500 | 52500 | 69 (0)| 00:00:01 |<br />---------------------------------------------------------------------------------------<br /></pre> <p>We can see above that the APPROX_* equivalent function was used, and slightly less work. 
But we only used this on a very small table.</p> The full list of session level settings is: <pre><br />alter session set approx_for_aggregation = TRUE;<br />alter session set approx_for_aggregation = FALSE;<br /><br />alter session set approx_for_count_distinct = TRUE;<br />alter session set approx_for_count_distinct = FALSE;<br /><br />alter session set approx_for_percentile = 'PERCENTILE_CONT DETERMINISTIC';<br />alter session set approx_for_percentile = PERCENTILE_DISC;<br />alter session set approx_for_percentile = NONE;<br /></pre> <p>Or at a system wide level:</p> <pre><br />alter system set approx_for_aggregation = TRUE;<br />alter system set approx_for_aggregation = FALSE;<br /><br />alter system set approx_for_count_distinct = TRUE;<br />alter system set approx_for_count_distinct = FALSE;<br /><br />alter system set approx_for_percentile = 'PERCENTILE_CONT DETERMINISTIC';<br />alter system set approx_for_percentile = PERCENTILE_DISC;<br />alter system set approx_for_percentile = NONE;<br /></pre> <p>And to reset back to the default settings:</p> <pre><br />alter system reset approx_for_aggregation;<br />alter system reset approx_for_count_distinct;<br />alter system reset approx_for_percentile;<br /></pre> Brendan Tierney tag:blogger.com,1999:blog-4669933501315263808.post-6945119724143442775 Mon Aug 07 2017 11:46:00 GMT-0400 (EDT) #datavault Meetups! Australia / New Zealand! http://danlinstedt.com/allposts/news/datavault-meetups-australia-new-zealand/ Data Vault 1.0 and 2.0 Meetups in Australia and New Zealand in August 2017. I will be there in person!! don&#8217;t miss this chance to talk with me! 
Dan Linstedt http://danlinstedt.com/?p=2858 Mon Aug 07 2017 09:25:34 GMT-0400 (EDT) Six Plus One Types of Interviewers http://bi.abhinavagarwal.net/2017/08/six-plus-one-types-of-interviewers.html <div dir="ltr" style="text-align: left;" trbidi="on"><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://3.bp.blogspot.com/-ey9UDH_OiqU/WYC6N3cj4NI/AAAAAAAAOY0/ebdu5vl3X5Msa2HPG2NMOnjGQPT8fJkigCLcBGAs/s1600/pexels-photo-288477-2.jpeg.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="732" data-original-width="1600" height="292" src="https://3.bp.blogspot.com/-ey9UDH_OiqU/WYC6N3cj4NI/AAAAAAAAOY0/ebdu5vl3X5Msa2HPG2NMOnjGQPT8fJkigCLcBGAs/s640/pexels-photo-288477-2.jpeg.jpg" width="640" /></a></div><div class="separator" style="clear: both; text-align: center;"></div><span style="color: black; float: left; font-family: &quot;times&quot; , serif , &quot;georgia&quot;; font-size: 48px; line-height: 30px; padding-right: 2px; padding-top: 2px;">R</span><br />emember Chuck Noland? The character in the movie Castaway, who has to use the blade of an ice-skate to extract his abscessed tooth, without anesthesia? The scene is painful to watch, yet you can't look away.<br /><br /><div style="text-align: left;"></div>Interviews have this habit of turning up a Chuck Noland - in the interviewee or the interviewer. You willingly agree to subject yourself to the wanton abuse by random strangers who you may have to end up working for or with. Apart from the talented few whom companies are more eager to hire than they are to get hired, most are in less enviable positions.<br /><div class="separator" style="clear: both; text-align: center;"><iframe width="320" height="266" class="YOUTUBE-iframe-video" data-thumbnail-src="https://i.ytimg.com/vi/WhVUgde_lNY/0.jpg" src="?feature=player_embedded" frameborder="0" allowfullscreen></iframe></div><br />What about interviewers? Not all are cut from the same cloth. 
But there are at least six types that I think we have all met in our lives, and a seventh one.<br /><h3 style="text-align: left;">1. The Interview As an End In Itself - Hyper-excited newbie</h3>You know this guy. You have been this person, most likely. You have a team now. You expect your team to grow. You have to build a team. You believe that you, and you alone, know what it takes to hire the absolutely best person for the opening you have.<br />You sit down and explain to the harried hiring HR person what the role is, what qualifications you are looking for, why the job is special, why just ordinary programming skills in ordinary programming languages will simply not cut it, why you as the hiring manager are special, and how you will, with the new hire, change the product, the company, and eventually the whole wide world. The HR executive therefore needs to spend every waking minute of her time in the pursuance of this nobler than noble objective. You badger your hiring rep incessantly, by phone, by IM, by email, in person, several times a day, asking for better resumes if you are getting many, and more if you aren't getting enough.<br />You read every single resume you get, several times over. You redline the points you don't like. You redline the points you like. You make notes on the resumes. You still talk to every single candidate. You continue interviewing, never selecting, till the economic climate changes and the vacancy is no longer available.<br />Yes, we all know this person.<br /><h3 style="text-align: left;">2. Knows what he is looking for and knows when he finds it</h3>This person is a somewhat rare commodity. This person does not suffer from buyer's remorse, knows that there is no such thing as a perfect candidate, and that the best he can hope to get is a person who comes off as reasonably intelligent, hard-working, ethical, and is going to be a team player.<br /><br />This person will however also suffer from blind spots. 
Specifically, two kinds of blindspots. The first is that he will look for and evaluate a person only on those criteria that he can assess best. The second is that he is more likely to hire candidates that are similar to other successful employees in his team, and will probably become less likely to take chances on a different type of a candidate. On the other hand, this manager also knows that conceptual skills are more important to test than specific knowledge of some arcane syntax in a geeky programming language - if you are talking of the world of software for instance.<br />This person is a rare commodity.<br /><h3 style="text-align: left;">3. Hire for Empire</h3>Like our previous type of hiring manager, this hiring manager is also very clear-headed. &nbsp;But, here the interviewer is hiring to add headcount to his team. Grow the empire. More people equates to more perceived power. This person understands three things, and understands them perfectly.<br />First, that if he is slow in hiring, then a hiring freeze may come in, and the headcount may no longer stay open.<br />Second, he (or she) is also unable and equally unwilling to evaluate a candidate, so just about anyone will do.<br />Third, and most importantly, this manager knows that every additional person reporting to him on the organization chart elevates him in importance vis-a-vis his peers, and therefore hiring is a goal noble enough to be pursued in its own right.<br />It's a win-win situation for everyone - except the customers, the company, and the team.<br /><h3 style="text-align: left;">4. I have other work to do. What am I doing here? What is he doing here?</h3>This person has little skin in the game. He has no dog in the fight. Pick your metaphor. He is there to take the interview because of someone's absence, or because in the charade of the interview "<i>process</i>" that exists at many companies, there exists a need to do this interview. 
The interviewer agrees because it is a tax that needs to be paid. You don't want to be labeled a non-team-player. Who knows when this Scarlet Letter may come to haunt you. So our interviewer sets aside half an hour or more, preferably less, of his time, and comes back wondering where thirty minutes of his life just went. That question remains unanswered.<br /><h3 style="text-align: left;">5. Know-it-all and desperate to show it</h3>This person perceives himself as an overachiever. This is the sort of person who will tell you with casual nonchalance that he had predicted the rise of Google in 1999 &nbsp;- just so you can get to know that he had heard of Google in 1999. This person knows he knows everything that there is to know, that it is his beholden duty to make you know it too, and it is your beholden duty to acknowledge this crushing sacerdotal burden he carries. This is the person who will begin the interview with a smirk, sustain a wry smile, transform into a frown, and end with an exaggerated sense of self-importance.<br />Do not get fooled.<br />This person is as desperate, if not more, to interview you as you are to do well on the interview. He will in all likelihood end up talking more than the interviewee.<br />In every group in every department of every company there exists at least one such person. The successful companies have no more than one.<br /><h3 style="text-align: left;">6. The rubber-stamp</h3>The boss has decided the person who needs to be hired. The charade needs to be completed. The requisite number of people have to interview the candidate so that HR can dot the "I"s and cross the "T"s. Our interviewer here has to speak with this person, with an air of deference. He will ask all the right questions, but the answers do not matter. You sign off with a heartfelt, "Great talking to you. Thanks a ton for your time. Take care, and we really look forward to working with/for you." No, don't belittle this rubber-stamp. 
He could be you.<br /><br />These are not mutually exclusive sets. There are overlaps that exist, sometimes in combinations that would warm Stephen King's heart.<br /><br />Oh, what about the seventh type of interviewer? He is the&nbsp;<b>Interviewer as Saboteur.</b>&nbsp; I will talk about him in a separate post.<br /><br /><i>This post appeared on&nbsp;<a href="https://www.linkedin.com/in/abhinavagarwal">LinkedIn</a>&nbsp;on&nbsp;<a href="https://www.linkedin.com/pulse/six-plus-one-types-interviewers-abhinav-agarwal">July 31st, 2017</a>.</i><br /><i>This is an edited version of a post I wrote on April 23rd, 2013.</i><br /><br /><span style="color: #666666; font-size: x-small;">© 2017, Abhinav Agarwal. All rights reserved.</span></div> Abhinav Agarwal tag:blogger.com,1999:blog-13714584.post-911733171752851300 Fri Aug 04 2017 13:29:00 GMT-0400 (EDT) ODTUG Kscope17 Social Media Lounge Interviews http://www.kscope18.odtug.com/p/bl/et/blogaid=740&source=1 ODTUG Kscope17 Social Media Lounge interview videos are now live on the ODTUG YouTube! Read the short recaps and watch the interviews with ODTUG Kscope17 Conference Committee Members, ODTUG Board Members, Oracle Professionals, Oracle ACEs, ACEd, ACE Associates, and ACE Alumni. ODTUG http://www.kscope18.odtug.com/p/bl/et/blogaid=740&source=1 Thu Aug 03 2017 14:07:33 GMT-0400 (EDT) ODTUG Kscope17 Social Media Lounge Interviews http://www.odtug.com/p/bl/et/blogaid=740&source=1 ODTUG Kscope17 Social Media Lounge interview videos are now live on the ODTUG YouTube! Read the short recaps and watch the interviews with ODTUG Kscope17 Conference Committee Members, ODTUG Board Members, Oracle Professionals, Oracle ACEs, ACEd, ACE Associates, and ACE Alumni. 
ODTUG http://www.odtug.com/p/bl/et/blogaid=740&source=1 Thu Aug 03 2017 14:07:33 GMT-0400 (EDT) Installing Oracle Data Integrator Cloud Service (ODICS) https://medium.com/red-pill-analytics/installing-oracle-data-integrator-cloud-service-odics-315da4c5979a?source=rss----abcc62a8d63e---4 <figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*24HBGU1v0K7Ep-ILNJn8uA.jpeg" /><figcaption>Photo Credit:<a href="https://unsplash.com/search/cloud?photo=RmNAdoJNFJs"> Todd Quackenbush</a></figcaption></figure><h4>Creating the Database and Java Cloud Service, Part 1</h4><p>Oracle Data Integrator Cloud Service is a new-ish cloud service from Oracle that allows us to run ODI as a platform in the Oracle Cloud. There are a lot of moving parts to this cloud service, so the first part of this article will cover creating the Oracle Cloud Database and the Java Cloud Service, and provisioning Oracle Data Integrator.</p><p>The first thing we need to set up if we are manually provisioning ODICS is a cloud database. In this guide we’ll use the Oracle Database Cloud.</p><p><strong>Creating the Database</strong></p><p>We log in to our Oracle Cloud Dashboard, navigate to the database tab, and select the icon next to the cog. The dropdown will give us a few options, but we select the “Open Service Console” option.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*OBXa8t_tlJneE4QV-ZeIOw.png" /></figure><p>If this is our first time creating an Oracle Database it brings us to a welcome page; we select “Go To Console”.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*wEZeOISg5ryD5xjLJTsdew.png" /></figure><p>If we don’t already have services running, we see a blank screen pointing to the “Create Service” button. So, of course, we select that button:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*XaIJw1_BWRNBtUDzwMxQLg.png" /></figure><p>The service page will now ask for a service name and description for our database. 
The page will also ask us to choose a metering frequency, software release, software edition and database type. We complete all that information and press the “Next” button.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*m4b82hOE3fSR274GbkaPHQ.png" /></figure><p>We are next asked for an Administration Password. We need at least 8 characters, with one uppercase and one lowercase letter, as well as a “Special Character” (apparently exclamation points don’t count as a special character.)</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*yjmXICuOACYOjqqzAIzEOQ.png" /></figure><p>We select a compute shape that meets the requirements of our workload and create a new SSH key. We need to supply a public SSH key for accessing the service securely. We can generate one using a third-party utility, or have the service create one for us.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*-QMdsP7mH51MOrMrtQFUOg.png" /></figure><p>We configure the backup with the cloud storage. <em>(We have to configure a backup in order to create the Java Cloud Service.) </em>After we complete the database and backup configuration, we select “Next”.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*s2AFRSyFY0hTjMXzvWOjfA.png" /></figure><p>We review and confirm the settings for the service instance and select “Create” (this will take a while).</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*RH9XZySkYmWM-gIRWfd-Xw.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*KYfi59pMtIxFWCrF3RBPVw.png" /></figure><p>Once the database is finally up and running, we need to enable the “ora_p2_dblistener” network access rule. 
“ora_p2_dblistener” controls access from the public internet to the <strong>ora_db</strong> security list on the <strong>ora_dblistener</strong> security application<em>.</em> To enable “ora_p2_dblistener”, we navigate to the database console, click on the drop down menu next to the database, and select “<strong>Access Rules</strong>”. We find “ora_p2_dblistener” and select the drop down menu next to it and enable it.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*sDd4IFBP-kHgpOcIBOyL-A.png" /></figure><p><strong>Creating the Java Cloud Service</strong></p><p>Now we navigate to the Oracle dashboard and select the dropdown next to the “Java” tab. We open the service console, select “go to console”, click the “Create Service” button, and select “Java Cloud Service” from the drop-down.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*Xu-gA8GiOv9NnJB9imGgsA.png" /></figure><p>We choose “MyJCS” as the service name, and could provide a description if we wanted to. As before, we need to select a service level, metering frequency, software release and edition. The default service level is set to Oracle Java Cloud Service with a monthly metering frequency. We then click the “Next” button.</p><p>(Note: If we selected the “NONE” option for the backup in the earlier step while creating the database, we get an error stating that there is no database that is set up to support JCS. Oracle requires us to have a backup configured in order to run Java Cloud Service.)</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*adhWZ4s4jaImEesznmxcjg.png" /></figure><p>We need to set up the service details. We select the compute shape, provide the SSH key that we used for the database, and select “2” for the cluster size. We then create a password for the WebLogic Local Administrator. 
We then expand the advanced settings and check the “<strong>Enable access to Administration consoles</strong>” box.</p><p>We now select the database that we want to use, and enter the admin username and password, make sure the load balancer is set to yes, and fill out the backup and recovery configuration. We then select “Create Cloud Storage Container”.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*rLVW1qkbfIdMy_CsN7FLvQ.png" /></figure><p>We select next and review our settings; once confirmed we select “Create”.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*wJoC1VKC25RlzQYCO9JfdA.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*BQxW22NjqtAtxe8Z5s0Bmw.png" /></figure><p>Once the Java Cloud Server has been created, we need to connect to the Oracle Java Cloud Service Instance through SSH. To do this we find the IP address of the admin server VM hosting the instance. This is located in the Java Cloud Service station. 
When we click on the instance, we should see the public IP under the Administration server Domain.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*SKf4vkNdk-zMUoHYYoqzQQ.png" /></figure><p>We start a Linux terminal and connect to the VM using SSH:</p><pre>ssh -i path_to_private_key opc@IP_of_JCS_Instance_Admin_Server</pre><pre>ssh -i privatekey opc@140.55.55.555</pre><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*m9vOD-KzjkePiwiD0c97FA.png" /></figure><p>Once on the VM, we switch to user “oracle” by issuing the “sudo” command, and then start the VNC Server:</p><pre>sudo su oracle</pre><pre>vncserver -nolisten local -geometry 1680x1050</pre><p>We are then prompted to create a password.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ZGiR6ESJ2OrlWEJuEOHo_A.png" /></figure><p>We now open a new local terminal to create an SSH tunnel to the VNC Server port on the Admin Server VM using the following command:</p><pre>ssh -i path_to_private_key -L 5901:IP_of_Tunnel_Server:5901 opc@IP_of_Admin_Server -N</pre><pre>ssh -i privatekey -L 5901:140.86.33.248:5901 opc@140.86.33.248 -N</pre><p>(Use the same IP as the Admin Server for the tunnel server, keeping port 5901.)</p><p>After creating the tunnel we now log in to the VM with VNC using localhost:5901.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/590/1*hfwZxmmrYtBSlRJsqTfnOQ.png" /></figure><p>If we didn’t do the above steps fast enough we get locked out of the VM. We restart the Java Server and get into the VM before the screensaver turns on (5 minutes). 
Once inside the VM we disable the screensaver by going to System &gt; Preference &gt; Screensaver &gt; Uncheck “Lock screen when screensaver is active.”</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/960/1*rZm3AGFjTxZnWxgMF_Qv_Q.png" /></figure><p>Using the terminal that we started the VNC server with, we ensure that we are using the Oracle user by typing the command “whoami.” We need to kill all locked sessions by running the following commands. First, find the locked sessions:</p><pre>ps -ef | grep vnc</pre><p>Then kill them:</p><pre>kill -9 &lt;session #&gt;</pre><p>Then we clean up any temp files associated with those locked sessions:</p><pre>rm -R /tmp/.X*</pre><p>We start the VNC server again logged in as oracle with the following command:</p><pre>vncserver -nolisten local -geometry 1680x1050</pre><p>We repeat the same steps used to open a tunnel, but this time on a different port. We used 5901 before but now we will use 5902 to keep it simple.</p><pre>ssh -i privatekey -L 5902:140.86.33.248:5902 opc@140.86.33.248 -N</pre><p>We make sure the screensaver is disabled so we won’t get locked out.</p><p>We now navigate to the <strong>/u01/zips/upperstack</strong> directory, where there <em>should</em> be an archive file called “ODI.zip”. It didn’t actually exist in our environment, so we downloaded the ODI software manually from OTN and transferred it to our VM. 
We unzip the ODI installation files and launch the installer:</p><pre>java -jar fmw_12.2.1.2.6_odi_generic.jar</pre><p>When prompted, we set the inventory directory to <strong>/u01/app/oraInventory</strong>, the operating system group to “oracle”, and select “OK” to create the inventory directory.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*xQiJQnwB2P4gtBlmlKgD9g.png" /></figure><p>We choose to <strong>Skip</strong> auto updates, and on the next screen we enter our “Oracle Home” location as “/u01/app/oracle/middleware” before moving on.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*Fi5OWX2lMsvyW_aUOC20vw.png" /></figure><p>We choose the <strong>Enterprise Installation</strong>…</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*pSftjwXPBuiADoW9tOOCoA.png" /></figure><p>…and then wait for the prerequisite checks to complete.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*sb25zntQecXc0k3wa_KJ9Q.png" /></figure><p>On the installation summary page, we select “Install” to begin the installation, and then wait for a successful install.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*vrDihTTwqG-nFyxETCG31g.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*DWUIydq4JfRKQhbOyGLDQQ.png" /></figure><p><strong>Congratulations!</strong> We successfully created the Oracle Cloud Database and the Java Cloud Service, and installed Oracle Data Integrator. 
Part Two will cover creating repositories to build the required Oracle Data Integrator schemas and updating the Java Cloud Service Domain to ensure we have the newest features and enhancements.</p><hr><p><a href="https://medium.com/red-pill-analytics/installing-oracle-data-integrator-cloud-service-odics-315da4c5979a">Installing Oracle Data Integrator Cloud Service (ODICS)</a> was originally published in <a href="https://medium.com/red-pill-analytics">Red Pill Analytics</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p> Travis Brannan https://medium.com/p/315da4c5979a Wed Aug 02 2017 15:30:27 GMT-0400 (EDT) Installing Oracle Data Integrator Cloud Service (ODICS) http://redpillanalytics.com/installingodics/ <p><img width="300" height="174" src="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/08/TravisImage.jpg?fit=300%2C174" class="attachment-medium size-medium wp-post-image" alt="Oracle Data Integrator" srcset="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/08/TravisImage.jpg?w=2000 2000w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/08/TravisImage.jpg?resize=300%2C174 300w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/08/TravisImage.jpg?resize=768%2C446 768w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/08/TravisImage.jpg?resize=1024%2C594 1024w" sizes="(max-width: 300px) 100vw, 300px" data-attachment-id="5324" data-permalink="http://redpillanalytics.com/installingodics/travisimage/" data-orig-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/08/TravisImage.jpg?fit=2000%2C1161" data-orig-size="2000,1161" data-comments-opened="1" 
data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="Oracle Data Integrator" data-image-description="&lt;p&gt; Oracle Data Integrator&lt;/p&gt; " data-medium-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/08/TravisImage.jpg?fit=300%2C174" data-large-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/08/TravisImage.jpg?fit=1024%2C594" /></p><p class="graf graf--p">Oracle Data Integrator Cloud Service is a new-ish cloud service from Oracle that allows us to run ODI as a platform in the Oracle Cloud. There are a lot of moving parts to this cloud service, so the first part of this article will cover creating the Oracle Cloud Database and the Java Cloud Service, and provisioning Oracle Data Integrator.</p> <p class="graf graf--p">The first thing we need to set up if we are manually provisioning ODICS is a cloud database. In this guide we’ll use the Oracle Database Cloud.</p> <p class="graf graf--p"><strong class="markup--strong markup--p-strong">Creating the Database</strong></p> <p class="graf graf--p">We log in to our Oracle Cloud Dashboard, navigate to the database tab, and select the icon next to the cog. 
The dropdown will give us a few options, but we select the “Open Service Console” option.</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i2.wp.com/cdn-images-1.medium.com/max/1600/1*OBXa8t_tlJneE4QV-ZeIOw.png?resize=1170%2C572&#038;ssl=1" data-image-id="1*OBXa8t_tlJneE4QV-ZeIOw.png" data-width="2852" data-height="1394" data-recalc-dims="1" /></figure> <p class="graf graf--p">If this is our first time creating an Oracle Database, it brings us to a welcome page; we select “Go To Console”.</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i1.wp.com/cdn-images-1.medium.com/max/1600/1*wEZeOISg5ryD5xjLJTsdew.png?resize=1170%2C589&#038;ssl=1" data-image-id="1*wEZeOISg5ryD5xjLJTsdew.png" data-width="2840" data-height="1430" data-recalc-dims="1" /></figure> <p class="graf graf--p">If we don’t already have services running, we see a blank screen pointing to the “Create Service” button. So, of course, we select that button:</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i1.wp.com/cdn-images-1.medium.com/max/1600/1*XaIJw1_BWRNBtUDzwMxQLg.png?resize=1170%2C542&#038;ssl=1" data-image-id="1*XaIJw1_BWRNBtUDzwMxQLg.png" data-width="2866" data-height="1328" data-recalc-dims="1" /></figure> <p class="graf graf--p">The service page will now ask for a service name and description for our database. The page will also ask us to choose a metering frequency, software release, software edition, and database type. We complete all that information and press the “Next” button.</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i1.wp.com/cdn-images-1.medium.com/max/1600/1*m4b82hOE3fSR274GbkaPHQ.png?resize=1170%2C411&#038;ssl=1" data-image-id="1*m4b82hOE3fSR274GbkaPHQ.png" data-width="2846" data-height="1000" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">We are next asked for an Administration Password. 
We need at least 8 characters, with one uppercase and one lowercase letter, as well as a “special character” (apparently exclamation points don’t count as a special character).</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i0.wp.com/cdn-images-1.medium.com/max/1600/1*yjmXICuOACYOjqqzAIzEOQ.png?resize=1170%2C502&#038;ssl=1" data-image-id="1*yjmXICuOACYOjqqzAIzEOQ.png" data-width="2260" data-height="970" data-recalc-dims="1" /></figure> <p class="graf graf--p">We select a compute shape that meets the requirements of our workload and create a new SSH key. We need to supply a public SSH key for accessing the service securely. We can generate one using a third-party utility, or have the service create one for us.</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i2.wp.com/cdn-images-1.medium.com/max/1600/1*-QMdsP7mH51MOrMrtQFUOg.png?resize=1170%2C646&#038;ssl=1" data-image-id="1*-QMdsP7mH51MOrMrtQFUOg.png" data-width="2388" data-height="1318" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">We configure the backup with the cloud storage. 
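</p><p class="graf graf--p">If we choose to generate the SSH key pair mentioned above ourselves, a standard OpenSSH command does the job (the file name and comment here are just examples, and -N "" skips the passphrase):</p><pre class="graf graf--pre">ssh-keygen -t rsa -b 2048 -N "" -f privatekey -C "odics"</pre><p class="graf graf--p">This writes the private key to “privatekey” (the file referenced in the ssh commands later in this article) and the public key to “privatekey.pub”, which is what we supply to the service console.</p><p class="graf graf--p">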
<em class="markup--em markup--p-em">(We have to configure a backup in order to create the Java Cloud Service.) </em>After we complete the database and backup configuration, we select “Next”.</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i1.wp.com/cdn-images-1.medium.com/max/1600/1*s2AFRSyFY0hTjMXzvWOjfA.png?resize=1170%2C498&#038;ssl=1" data-image-id="1*s2AFRSyFY0hTjMXzvWOjfA.png" data-width="2120" data-height="902" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">We review and confirm the settings for the service instance and select “Create” (this will take a while).</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i1.wp.com/cdn-images-1.medium.com/max/1600/1*RH9XZySkYmWM-gIRWfd-Xw.png?resize=1170%2C709&#038;ssl=1" data-image-id="1*RH9XZySkYmWM-gIRWfd-Xw.png" data-width="2324" data-height="1408" data-recalc-dims="1" /></figure> <figure class="graf graf--figure"><img class="graf-image" src="https://i2.wp.com/cdn-images-1.medium.com/max/1600/1*KYfi59pMtIxFWCrF3RBPVw.png?resize=1170%2C573&#038;ssl=1" data-image-id="1*KYfi59pMtIxFWCrF3RBPVw.png" data-width="2318" data-height="1136" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">Once the database is finally up and running, we need to enable the “ora_p2_dblistener” network access rule. “ora_p2_dblistener” controls access from the public internet to the <strong class="markup--strong markup--p-strong">ora_db</strong> security list on the <strong class="markup--strong markup--p-strong">ora_dblistener</strong> security application<em class="markup--em markup--p-em">.</em> To enable “ora_p2_dblistener”, we navigate to the database console, click on the drop down menu next to the database, and select “<strong class="markup--strong markup--p-strong">Access Rules</strong>”. 
We find “ora_p2_dblistener” and select the drop down menu next to it and enable it.</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i0.wp.com/cdn-images-1.medium.com/max/1600/1*sDd4IFBP-kHgpOcIBOyL-A.png?resize=1170%2C632&#038;ssl=1" data-image-id="1*sDd4IFBP-kHgpOcIBOyL-A.png" data-width="2312" data-height="1248" data-recalc-dims="1" /></figure> <p class="graf graf--p"><strong class="markup--strong markup--p-strong">Creating the Java Cloud Service</strong></p> <p class="graf graf--p">Now we navigate to the Oracle dashboard and select the dropdown next to the “Java” tab. We open the service console, select “go to console”, click the “Create Service” button, and select “Java Cloud Service” from the drop-down.</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i1.wp.com/cdn-images-1.medium.com/max/1600/1*Xu-gA8GiOv9NnJB9imGgsA.png?resize=1170%2C540&#038;ssl=1" data-image-id="1*Xu-gA8GiOv9NnJB9imGgsA.png" data-width="2828" data-height="1306" data-recalc-dims="1" /></figure> <p class="graf graf--p">We choose “MyJCS” as the service name, and could provide a description if we wanted to. As before, we need to select a service level, metering frequency, software release and edition. The default service level is set to Oracle Java Cloud Service with a monthly metering frequency. We then click the “Next” button.</p> <p class="graf graf--p">(Note: If we selected the “NONE” option for the backup in the earlier step while creating the database, we get an error stating that there is no database that is set up to support JCS. 
Oracle requires us to have a backup configured in order to run Java Cloud Service.)</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i0.wp.com/cdn-images-1.medium.com/max/1600/1*adhWZ4s4jaImEesznmxcjg.png?resize=1170%2C396&#038;ssl=1" data-image-id="1*adhWZ4s4jaImEesznmxcjg.png" data-width="2230" data-height="754" data-recalc-dims="1" /></figure> <p class="graf graf--p">We need to set up the service details. We select the compute shape, provide the SSH key that we used for the database, and select “2” for the cluster size. We then create a password for the WebLogic Local Administrator. We then expand the advanced settings and check the “<strong class="markup--strong markup--p-strong">Enable access to Administration consoles</strong>” box.</p> <p class="graf graf--p">We now select the database that we want to use; we enter the admin username and password, make sure the load balancer is set to yes, and fill out the backup and recovery configuration. We then select “Create Cloud Storage Container”.</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i0.wp.com/cdn-images-1.medium.com/max/1600/1*rLVW1qkbfIdMy_CsN7FLvQ.png?resize=1170%2C718&#038;ssl=1" data-image-id="1*rLVW1qkbfIdMy_CsN7FLvQ.png" data-width="2216" data-height="1360" data-recalc-dims="1" /></figure> <p class="graf graf--p">We select “Next” and review our settings; once confirmed we select “Create”.</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i0.wp.com/cdn-images-1.medium.com/max/1600/1*wJoC1VKC25RlzQYCO9JfdA.png?resize=1170%2C730&#038;ssl=1" data-image-id="1*wJoC1VKC25RlzQYCO9JfdA.png" data-width="2224" data-height="1388" data-recalc-dims="1" /></figure> <figure class="graf graf--figure"><img class="graf-image" src="https://i0.wp.com/cdn-images-1.medium.com/max/1600/1*BQxW22NjqtAtxe8Z5s0Bmw.png?resize=1170%2C565&#038;ssl=1" data-image-id="1*BQxW22NjqtAtxe8Z5s0Bmw.png" data-width="2302" data-height="1112" 
data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">Once the Java Cloud Service instance has been created, we need to connect to it through SSH. To do this we find the IP address of the admin server VM hosting the instance, located in the Java Cloud Service console. When we click on the instance, we should see the public IP under the Administration Server domain.</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i1.wp.com/cdn-images-1.medium.com/max/1600/1*SKf4vkNdk-zMUoHYYoqzQQ.png?resize=1170%2C680&#038;ssl=1" data-image-id="1*SKf4vkNdk-zMUoHYYoqzQQ.png" data-width="2200" data-height="1278" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">We start a Linux terminal and connect to the VM using SSH:</p> <pre class="graf graf--pre">ssh -i path_to_private_key opc@IP_of_JCS_Instance_Admin_Server</pre> <pre class="graf graf--pre">ssh -i privatekey opc@140.55.55.555 </pre> <figure class="graf graf--figure"><img class="graf-image" src="https://i1.wp.com/cdn-images-1.medium.com/max/1600/1*m9vOD-KzjkePiwiD0c97FA.png?resize=1168%2C456&#038;ssl=1" data-image-id="1*m9vOD-KzjkePiwiD0c97FA.png" data-width="1168" data-height="456" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">Once on the VM, we switch to user “oracle” by issuing the “sudo” command, and then start the VNC Server:</p> <pre class="graf graf--pre">sudo su oracle</pre> <pre class="graf graf--pre">vncserver -nolisten local -geometry 1680x1050</pre> <p class="graf graf--p">We are then prompted to create a password.</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i0.wp.com/cdn-images-1.medium.com/max/1600/1*ZGiR6ESJ2OrlWEJuEOHo_A.png?resize=1134%2C650&#038;ssl=1" data-image-id="1*ZGiR6ESJ2OrlWEJuEOHo_A.png" data-width="1134" data-height="650" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">We now open a new local terminal 
to create an SSH tunnel to the VNC Server port on the Admin Server VM using the following command:</p> <pre class="graf graf--pre">ssh -i path_to_private_key -L 5901:IP_of_Tunnel_Server:5901 opc@IP_of_Admin_Server -N</pre> <pre class="graf graf--pre">ssh -i privatekey -L 5901:140.86.33.248:5901 opc@140.86.33.248 -N</pre> <p class="graf graf--p">(Use the same IP as the Admin Server for IP_of_Tunnel_Server, keeping port 5901.)</p> <p class="graf graf--p">After creating the tunnel, we log in to the VM with VNC using localhost:5901.</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i2.wp.com/cdn-images-1.medium.com/max/1600/1*hfwZxmmrYtBSlRJsqTfnOQ.png?resize=590%2C492&#038;ssl=1" data-image-id="1*hfwZxmmrYtBSlRJsqTfnOQ.png" data-width="590" data-height="492" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">If we don’t complete the above steps quickly enough, we get locked out of the VM; in that case we restart the Java server and get back into the VM before the screensaver turns on (5 minutes). Once inside the VM, we disable the screensaver by going to System &gt; Preferences &gt; Screensaver and unchecking “Lock screen when screensaver is active.”</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i0.wp.com/cdn-images-1.medium.com/max/1600/1*rZm3AGFjTxZnWxgMF_Qv_Q.png?resize=960%2C788&#038;ssl=1" data-image-id="1*rZm3AGFjTxZnWxgMF_Qv_Q.png" data-width="960" data-height="788" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">Using the terminal that we started the VNC server with, we ensure that we are using the oracle user by typing the command “whoami.” We need to kill all locked sessions by running the following commands. 
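</p><p class="graf graf--p">As an aside, if the old session is still tracked by the VNC server, its own kill switch is the tidier option, since it should also clean up the session’s lock and socket files (“:1” is the display number, which may differ in our environment):</p><pre class="graf graf--pre">vncserver -kill :1</pre><p class="graf graf--p">When that fails because the session is wedged, we fall back to killing it by hand. 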
First, find the locked sessions:</p> <pre class="graf graf--pre">ps -ef | grep vnc</pre> <p class="graf graf--p">Then kill them:</p> <pre class="graf graf--pre">kill -9 &lt;session #&gt;</pre> <p class="graf graf--p">Then we clean up any temp files associated with those locked sessions:</p> <pre class="graf graf--pre">rm -R /tmp/.X*</pre> <p class="graf graf--p">We start the VNC server again, logged in as oracle, with the following command:</p> <pre class="graf graf--pre">vncserver -nolisten local -geometry 1680x1050</pre> <p class="graf graf--p">We repeat the same steps used to open a tunnel, but this time on a different port. We used 5901 before, but now we will use 5902 to keep it simple.</p> <pre class="graf graf--pre">ssh -i privatekey -L 5902:140.86.33.248:5902 opc@140.86.33.248 -N</pre> <p class="graf graf--p">We make sure the screensaver is disabled so we won’t get locked out.</p> <p class="graf graf--p">We now navigate to the <strong class="markup--strong markup--p-strong">/u01/zips/upperstack</strong> directory, where there <em class="markup--em markup--p-em">should</em> be an archive file called “ODI.zip”. It didn’t actually exist in our environment, so we downloaded the ODI software manually from OTN and transferred it to our VM. 
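</p><p class="graf graf--p">For the transfer itself, scp over the same private key and admin server IP used above works; here we copy the archive to /tmp (a location the opc user can write to) and can then move it into place as the oracle user. The local file name assumes the archive we downloaded is called “ODI.zip”:</p><pre class="graf graf--pre">scp -i privatekey ODI.zip opc@140.86.33.248:/tmp/</pre><p class="graf graf--p">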
We unzip the ODI installation files and launch the installer:</p> <pre class="graf graf--pre">java -jar fmw_12.2.1.2.6_odi_generic.jar</pre> <p class="graf graf--p">When prompted, we set the inventory directory to <strong class="markup--strong markup--p-strong">/u01/app/oraInventory</strong>, the operating system group to “oracle”, and select “OK” to create the inventory directory.</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i1.wp.com/cdn-images-1.medium.com/max/1600/1*xQiJQnwB2P4gtBlmlKgD9g.png?resize=1170%2C689&#038;ssl=1" data-image-id="1*xQiJQnwB2P4gtBlmlKgD9g.png" data-width="1202" data-height="708" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">We choose to <strong class="markup--strong markup--p-strong">Skip</strong> auto updates, and on the next screen we enter our “Oracle Home” location as “/u01/app/oracle/middleware” before moving on.</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i2.wp.com/cdn-images-1.medium.com/max/1600/1*Fi5OWX2lMsvyW_aUOC20vw.png?resize=1170%2C939&#038;ssl=1" data-image-id="1*Fi5OWX2lMsvyW_aUOC20vw.png" data-width="1218" data-height="978" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">We choose the <strong class="markup--strong markup--p-strong">Enterprise Installation</strong>…</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i0.wp.com/cdn-images-1.medium.com/max/1600/1*pSftjwXPBuiADoW9tOOCoA.png?resize=1170%2C950&#038;ssl=1" data-image-id="1*pSftjwXPBuiADoW9tOOCoA.png" data-width="1200" data-height="974" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">…and then wait for the prerequisite checks to complete.</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i0.wp.com/cdn-images-1.medium.com/max/1600/1*sb25zntQecXc0k3wa_KJ9Q.png?resize=1170%2C959&#038;ssl=1" 
data-image-id="1*sb25zntQecXc0k3wa_KJ9Q.png" data-width="1206" data-height="988" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p">On the installation summary page, we select “Install” to begin the installation, and then wait for a successful install.</p> <p>&nbsp;</p> <figure class="graf graf--figure"><img class="graf-image" src="https://i0.wp.com/cdn-images-1.medium.com/max/1600/1*vrDihTTwqG-nFyxETCG31g.png?resize=1170%2C957&#038;ssl=1" data-image-id="1*vrDihTTwqG-nFyxETCG31g.png" data-width="1208" data-height="988" data-recalc-dims="1" /></figure> <figure class="graf graf--figure"><img class="graf-image" src="https://i1.wp.com/cdn-images-1.medium.com/max/1600/1*DWUIydq4JfRKQhbOyGLDQQ.png?resize=1170%2C950&#038;ssl=1" data-image-id="1*DWUIydq4JfRKQhbOyGLDQQ.png" data-width="1212" data-height="984" data-recalc-dims="1" /></figure> <p>&nbsp;</p> <p class="graf graf--p"><strong class="markup--strong markup--p-strong">Congratulations!</strong> We successfully created the Oracle Cloud Database and the Java Cloud Service, and installed Oracle Data Integrator. Part Two will cover creating repositories to build the required Oracle Data Integrator schemas and updating the Java Cloud Service Domain to ensure we have the newest features and enhancements.</p> Travis Brannan http://redpillanalytics.com/?p=5300 Wed Aug 02 2017 15:30:27 GMT-0400 (EDT) Secure Dashboard Access in Oracle BI Cloud Service (BICS): Sales and Marketing Dashboard Example http://blog.performancearchitects.com/wp/2017/08/02/secure-dashboard-access-in-oracle-bi-cloud-service-bics-sales-and-marketing-dashboard-example/ <p>Author: Linda Stewart, Performance Architects</p> <p><a href="https://cloud.oracle.com/business_intelligence">Oracle Business Intelligence Cloud Service (BICS)</a> is part of Oracle’s Platform as a Service (PaaS) offerings.  
The <a href="http://www.oracle.com">Oracle</a> cloud offerings permit enterprise IT teams to rapidly build and deploy applications without the need to set up expensive infrastructure.  There’s still a need, however, for strong security capabilities in the cloud, and this blog post discusses how BICS security works and how to set up secure dashboard access in the solution via a sales and marketing dashboard sample business case.</p> <p><strong>How Oracle Cloud Security Works</strong></p> <p>When your business signs up for an <a href="https://cloud.oracle.com/home">Oracle Cloud</a> account, Oracle Cloud creates an identity domain specific to your company. As users log in to an Oracle Cloud service, Oracle Cloud identity management controls user authentication and the features of the service that users can access, using <a href="http://www.oracle.com/us/products/middleware/identity-management/oracle-enterprise-sso/overview/index.html">Oracle Enterprise Single Sign-On (SSO)</a>.  SSO may be federated between on-premises and cloud environments.  Oracle Cloud uses LDAP schemas for storing the identities.</p> <p><strong>User and Role Management Overview</strong></p> <p>BICS security comprises two items: 1. Oracle Cloud Identity Domain Users and Roles, and 2. Oracle BICS Application Users and Roles.</p> <p>First, we add the user to Oracle Cloud Identity Management using the Identity Management Administrator credentials.  
Click “Users,” then on the following form, click “Add&#8221;:</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda1.jpg"><img class="alignnone size-medium wp-image-2090" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda1-300x205.jpg" alt="" width="300" height="205" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda1-300x205.jpg 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda1-768x525.jpg 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda1-624x427.jpg 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda1.jpg 955w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Typically, we use the email address as the user name.</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda2.jpg"><img class="alignnone size-medium wp-image-2089" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda2-300x181.jpg" alt="" width="300" height="181" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda2-300x181.jpg 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda2-768x463.jpg 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda2-624x376.jpg 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda2.jpg 1019w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>A few things have changed in this form.  We can now set a role in the Cloud Service for each user.  If the new user just views reports, we will click on the “Service” drop down, select “BICS (Business Intelligence),” and then click the button with the two “greater than” signs. This pushes the roles to the “Selected Roles” box.  
If the user is <em>not</em> a reports or dashboard author, then uncheck “bics BI Cloud Service Advanced Content Authors” as shown above.</p> <p>Click “Add” to save the user.</p> <p>Next, open BICS as an administrator to complete user creation.  Open “Console” and select “Users” and “Roles.”</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda3.jpg"><img class="alignnone size-medium wp-image-2088" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda3-300x189.jpg" alt="" width="300" height="189" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda3-300x189.jpg 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda3-768x484.jpg 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda3-624x393.jpg 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda3.jpg 922w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>In BICS, there are five predefined application roles.  We do not have to add our new user to any of the predefined application roles.</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda4.jpg"><img class="alignnone wp-image-2087 size-medium" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda4-300x188.jpg" alt="" width="300" height="188" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda4-300x188.jpg 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda4.jpg 568w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p><strong>Sales and Marketing Dashboard Example</strong></p> <p>Let’s say we want to secure our data by sales region.  We have a global sales manager who should be able to see all sales regions.  In our fact table, we have the name of the sales region on each data row.  
We want to restrict the dashboard to members of the sales department, and then filter the data so that each salesperson sees only their own regional data while the sales manager sees all regions.  Let’s also say there are marketing departmental dashboards in BICS; marketing should not be able to see the sales dashboard, and sales should not be able to see the marketing dashboard.</p> <p>To do this, we can establish two application roles to manage the dashboard level.  Add “Sales Dashboard.” Click “Save.”  Repeat for the marketing dashboard.</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda5.jpg"><img class="alignnone size-medium wp-image-2086" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda5-300x200.jpg" alt="" width="300" height="200" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda5-300x200.jpg 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda5-768x511.jpg 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda5-1024x681.jpg 1024w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda5-624x415.jpg 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda5.jpg 1039w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Add members (users) to “Sales” by selecting the button at the right end of the “Sales Application Role” and then by selecting “Manage Members:”</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda6.jpg"><img class="alignnone size-full wp-image-2085" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda6.jpg" alt="" width="191" height="107" /></a></p> <p>Search for “John Doe” and add this user to the “Sales Dashboard Role” by clicking the user name in the left box and then by clicking the single arrow to place the user in the “Selected Users” panel.  
Click “OK” to save.</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda7.jpg"><img class="alignnone size-medium wp-image-2084" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda7-300x273.jpg" alt="" width="300" height="273" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda7-300x273.jpg 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda7.jpg 568w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Create the marketing dashboard role and add users to that role using the same process.</p> <p>Next, we will create roles to secure the data in the “Sales” fact. Open “Application Role” and add the following roles:</p> <ul> <li>Global Sales Manager Data</li> <li>Eastern Sales Data</li> <li>Western Sales Data</li> <li>Marketing Data</li> </ul> <p>Use the same process as creating the “Sales Application Role,” and add a member to each role.</p> <p>We can divide the sales regions as many ways as we want, as long as the sales region column in our fact table contains the data to support the split.  To simplify the example, we will establish two sales regions and a sales manager role (who sees all sales regions).</p> <p>Once the application roles are complete, we can navigate to the “BICS Modeler” to secure the data, and then to the “BICS Catalog” to secure the dashboards.</p> <p>In the Modeler, we will need administrator rights so we can filter the data by role:</p> <ul> <li>Open “Data Model” in the left panel</li> <li>Select “Fact-Sales”</li> <li>Select the “Lock to Edit” button</li> <li>Select the “Data Filters” tab</li> <li>Click “Add”</li> <li>Select “Role” as “Eastern Sales Data”</li> <li>Click the “FX” button</li> <li>Filter the fact data on Region=’eastern’</li> </ul> <p>Repeat for the Western region.  
This secures our data.</p> <p>Lastly, as demonstrated in the image below, we will secure the dashboards by opening the Catalog:</p> <ul> <li>Under the “Company Shared Folder,” we established “Sales” and “Marketing” folders</li> <li>Add both the sales and marketing dashboard roles to each folder</li> <li>Apply permissions recursively</li> <li>Select the “Sales” folder</li> <li>In the “Tasks” pane, select “Permissions”</li> <li>Click the ‘+’ button to add new application roles</li> <li>Select the sales and marketing dashboard roles</li> <li>Set as custom permissions</li> </ul> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda8.jpg"><img class="alignnone size-medium wp-image-2083" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda8-300x257.jpg" alt="" width="300" height="257" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda8-300x257.jpg 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda8.jpg 561w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Use the pencil icon on the “Marketing” role and uncheck all of the boxes to change the permissions to “No Access.”</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda9.jpg"><img class="alignnone size-medium wp-image-2082" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda9-300x202.jpg" alt="" width="300" height="202" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda9-300x202.jpg 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda9-768x516.jpg 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda9-624x419.jpg 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/07/linda9.jpg 967w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Repeat the process for the “Marketing” dashboard, but set the “Marketing” role to “Full Control” and 
the “Sales” role to “No Access.”</p> Melanie Mathews http://blog.performancearchitects.com/wp/?p=2081 Wed Aug 02 2017 05:34:50 GMT-0400 (EDT) Checkmate for OBI is Now Free https://medium.com/red-pill-analytics/checkmate-for-obi-free-ec48e30e0787?source=rss----abcc62a8d63e---4 <figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*cUhN8XiZJv8b6yIOZ2qgfg.jpeg" /></figure><p>For those of you in the Oracle Business Intelligence space, you have likely heard us or someone else talk about <a href="http://redpillanalytics.com/checkmate/">Checkmate</a> in the past. We have several posts on our blog about Checkmate, as well as a <a href="https://www.youtube.com/playlist?list=PL09-gIRXkzOVRxuXw7Sn0KUQLqdvBzV8g">YouTube playlist</a> dedicated to it, so I won’t spend too much time in this article evangelizing it. My purpose here is to clarify our new <a href="https://en.wikipedia.org/wiki/Freemium">freemium</a> model for Checkmate and talk through why we made this decision.</p><h3>Free as in Beer</h3><p>To distinguish the sometimes radically different meanings of the word “free” in the English language generically and open-source software specifically, long-time open-source advocate <a href="http://wiki.c2.com/?RichardStallman">Richard Stallman</a> coined the phrase <a href="http://wiki.c2.com/?FreeAsInBeer">“Free as in Beer”</a> to drive the point home. We haven’t open-sourced Checkmate, mostly due to its reliance on closed-source software from Oracle, either through Oracle Business Intelligence or Oracle Analytics Cloud. But we are making it completely free to use… free as in beer.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*GaWdeP1dR9_jzDn-kvrG0g.jpeg" /><figcaption>Sweetwater (in Atlanta) paid off a Super Bowl bet with Sam Adams (in Boston) by producing “Patriot” for a limited release. Atlantans always pay their debts.</figcaption></figure><p>We really mean this. 
We are actively working with other Oracle BI Partners to assist them with their own implementations, and are <a href="https://github.com/RedPillAnalytics/checkmate-examples">developing public material</a> to make it easier to implement. So why did we make this decision?</p><p>Checkmate has been a lucrative business for Red Pill Analytics. We have multiple Fortune 500 customers that run their entire development process using our product, and in the past we’ve charged an annual license for this. But we’ve also learned a few things during our tenure selling Checkmate, and we thought it was time to reboot our marketing and sales approach. You know… put everything on the table. Consider all options. #ChallengeEverything.</p><p>Here is what we’ve learned:</p><h4>Long Sales Cycle</h4><p>It usually took a long time to sell Checkmate. Really long. This involved lots of demos, proofs-of-concept, question-and-answer sessions, etc. It was all very involved. And frankly… the price we were charging for Checkmate often didn’t justify that kind of investment from us. We just needed to get the product in users’ hands, as easily and efficiently as we could, and let it speak for itself.</p><h4>We Believe this is the Only Way to do OBIEE Development</h4><p>We wrote Checkmate because the OBIEE development lifecycle is broken. That’s as delicately as I can say it. BI and Analytics development is <em>real</em> development, and the lifecycle needs to represent that fact. Source control is a necessity. Either <a href="https://www.atlassian.com/git/tutorials/comparing-workflows#gitflow-workflow">GitFlow</a> or <a href="https://guides.github.com/introduction/flow/">GitHub Flow</a> is a necessity. Equal support for metadata and the catalog is a necessity. Regression testing is a necessity. Automated deployment is a necessity. Checkmate is the only solution on the market that addresses all of these things, and we want it to be the status quo for OBIEE development. 
Because it’s the right thing to do.</p><h4>We Want to Use Checkmate with Our Own Customers</h4><p>This is the primary reason we made the decision we did. Since the day we hung out our shingle, Red Pill Analytics has offered <a href="http://redpillanalytics.com/capacity-analytics/">Capacity Analytics</a>, and this is still the primary way we engage with most of our customers. We have always offered Checkmate Cloud free of charge to our Capacity Analytics customers, but this simply hasn’t moved the meter when those customers are firmly planted on-prem. Of course, we could offer Checkmate for OBI on-prem in those situations as well, but our customers were often too concerned with posterity to pull the trigger. “What happens if the engagement ends? Will we then need to purchase a license for Checkmate separately?” These eventualities made it difficult for us to use Checkmate for our own customer projects, and we lost all the productivity gains that our consultants were used to.</p><h3>Introducing Checkmate for OBI… Free!</h3><p>We are all working hard to make the transition to free Checkmate for OBI. First, we’ve structured a subscription model for those organizations that want more stability and support with their OBIEE development process. We are converting our licensed customers to an equivalent subscription, and those customers can then choose whether to renew that subscription at the same interval with which they have renewed their software license. We believe we still have a lot to offer organizations using Checkmate, and our subscriptions can be purchased if and when the time is right.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*LI0ZKXw9ib4T4q9ZIfh9gg.png" /><figcaption>See the Red Pill Analytics website for more information.</figcaption></figure><p>When looking at subscription offerings, you’ll notice the breakdown between Studio and Build Framework. 
Studio is simply the user-facing client that developers use as part of the process of building content in OBIEE. It facilitates the simultaneous development of both metadata and catalog content in a single feature using either the GitFlow or GitHub Flow process. The Build Framework is the Continuous Delivery and DevOps aspect of Checkmate for OBI that enables automated regression testing, automated deployment, and artifact publishing and retrieval. Additionally, we offer the Managed Build subscription, which offers everything from both the Studio and the Build Framework subscriptions, but also includes our Checkmate Analytics framework (look for another post on this in the future), and additional services from us for offloading aspects of the lifecycle.</p><p>I’ll be writing more about Checkmate in the coming weeks, so keep your eyes on the blog for more content. And as always… if you have any questions, please email me at stewart@redpillanalytics.com.</p><hr><p><a href="https://medium.com/red-pill-analytics/checkmate-for-obi-free-ec48e30e0787">Checkmate for OBI is Now Free</a> was originally published in <a href="https://medium.com/red-pill-analytics">Red Pill Analytics</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p> Stewart Bryson https://medium.com/p/ec48e30e0787 Tue Aug 01 2017 13:44:50 GMT-0400 (EDT) Checkmate for OBI is Now Free http://redpillanalytics.com/checkmate-for-obi-is-now-free/ Stewart Bryson http://redpillanalytics.com/?p=5302 Tue Aug 01 2017 13:44:36 GMT-0400 (EDT) Part 5 - The right to be forgotten (EU GDPR)s http://www.oralytics.com/2017/07/part-5-right-to-be-forgotten-eu-gdprs.html <p>This is the fifth part of a series of blog posts on '<a href="http://www.oralytics.com/2017/06/how-eu-gdpr-will-affect-use-of-machine.html">How the EU GDPR will affect the use of Machine Learning</a>'</p> <p>Article 17, titled Right of Erasure (right to be forgotten), allows a person to obtain their data and for the data controller to ensure that the personal data is erased without any delay.</p> <p>This does not mean that their data can be flagged for non-contact, as I've seen done in many companies, only for the odd one of these people to be contacted anyway.</p> <p>It will also allow for people to choose to not take part in data profiling, meaning that these people cannot be included in any of the input data sets. 
Nor should it be the scenario where they are included but flagged as not to be contacted in any post-ML process where consumers are contacted, as I've seen in lots of places.</p> <p><img src="https://lh3.googleusercontent.com/-EZH1yWuHPqQ/WVZxD3C8PNI/AAAAAAAAMOY/mg59ggeGwSQKq9kcaNDGxkNfMvD8DqclwCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="554" height="162" /></p> <br><p>Click back to '<a href="http://www.oralytics.com/2017/06/how-eu-gdpr-will-affect-use-of-machine.html">How the EU GDPR will affect the use of Machine Learning - Part 1</a>' for links to all the blog posts in this series.</p> Brendan Tierney tag:blogger.com,1999:blog-4669933501315263808.post-3328953025375080845 Mon Jul 31 2017 11:03:00 GMT-0400 (EDT) How to Use Data Flows to Pre-calculate Data (Ep 052) https://www.youtube.com/watch?v=NTtAmnGIAWQ Red Pill Analytics yt:video:NTtAmnGIAWQ Mon Jul 31 2017 09:13:28 GMT-0400 (EDT) Important Design Considerations When Moving from Oracle Essbase On-Premise to Oracle Analytics Cloud (OAC) http://blog.performancearchitects.com/wp/2017/07/26/important-design-considerations-when-moving-from-oracle-essbase-on-premise-to-oracle-analytics-cloud-oac/ <p>Author: Andy Tauro, Performance Architects</p> <p>We have witnessed a steady stream of commentary on the promise of what <a href="https://cloud.oracle.com/en_US/oac">Oracle Analytics Cloud (OAC)</a>, which brings together <a href="http://www.oracle.com/technetwork/middleware/essbase/overview/index.html">Essbase</a>, <a href="https://cloud.oracle.com/business_intelligence">BI Cloud Service</a>, and <a href="https://www.oracle.com/solutions/business-analytics/data-visualization.html">Data Visualization</a> (DV) in the cloud, would do when it was released, as well as where to buy a subscription. 
Now that OAC is generally available as a <a href="https://en.wikipedia.org/wiki/Platform_as_a_service">Platform as a Service (PaaS)</a> offering, we decided to take it through its paces and see what it can really do for those of you who own Essbase on-premise:</p> <ul> <li><strong>Hybrid BSO is enabled. </strong>Hybrid BSO brings the most-loved functionality of ASO to BSO cubes, and in OAC it is turned on by default. This means reduced calculation overhead and greatly reduced disk space usage (= reduced subscription cost!) for most existing solutions, and for those yet to be built.</li> <li><strong>Implied Sharing went away</strong>. This feature has long been painful for Essbase development veterans. While initially implemented with good intentions, it has caused more trouble than it has helped, and has often been a trap to avoid through careful design practices. In OAC, all stored members in Essbase will hold data, so this issue is resolved!</li> <li><strong>Dedicated Lifecycle Management (Essbase LCM Utility) works.</strong> The tool to extract Essbase cubes from on-premise installations works with 11.1.2.4.X (and later) versions of Essbase…and it works beautifully, with no changes needed to existing cubes except one (covered in the next point).</li> <li><strong>All cubes are Unicode only.</strong> While Essbase has had Unicode capabilities forever, for several reasons folks have preferred to stay non-Unicode. Well, no more: the Unicode version of Essbase has matured enough that it is now the only option. So, when migrating cubes to OAC using the Essbase LCM Utility, first convert them to Unicode. Otherwise, they will need to be rebuilt using other means, such as Application Workbooks.</li> </ul> <p>While there are a few more technical differences between the on-premise and cloud versions of Essbase, they are mainly details, and do not necessarily impact the overall solution as much as the points above. 
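Since non-Unicode cubes must be converted before the LCM extract, a migration effort might start with a simple pre-flight check over the application inventory. A hypothetical Python sketch (the inventory structure and its unicode flag are illustrative; they are not actual LCM Utility output):

```python
# Hypothetical pre-flight check before an LCM migration to OAC.
# 'apps' stands in for an inventory of on-premise Essbase applications;
# the 'unicode' flag is illustrative, not actual LCM Utility output.
def migration_plan(apps):
    """Split applications into those ready to extract and those needing
    a Unicode conversion (or an Application Workbook rebuild) first."""
    ready = [a["name"] for a in apps if a["unicode"]]
    convert_first = [a["name"] for a in apps if not a["unicode"]]
    return ready, convert_first

apps = [
    {"name": "Sample", "unicode": True},
    {"name": "Demo", "unicode": False},
]
ready, convert_first = migration_plan(apps)
print(ready, convert_first)  # → ['Sample'] ['Demo']
```

Anything in the second list gets converted to Unicode (or rebuilt) before the extract is attempted, rather than failing mid-migration.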
Stay tuned for more details on how Essbase works in OAC, or send us a note at <a href="mailto:sales@performancearchitects.com">sales@performancearchitects.com</a> if you want more details on how OAC could work for you.</p> Melanie Mathews http://blog.performancearchitects.com/wp/?p=2079 Wed Jul 26 2017 05:28:17 GMT-0400 (EDT) Management Mantra for Startups - Waste Not, Vacate Not http://bi.abhinavagarwal.net/2017/07/management-mantra-for-startups-waste.html <div dir="ltr" style="text-align: left;" trbidi="on"><h2 style="text-align: left;"></h2><h2 style="text-align: left;"><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-OmKNB3u1cqI/WXRd0oTZlRI/AAAAAAAAOUU/VJsQ-dgXPqofiG5naA0syjU1isMw--0RwCLcBGAs/s1600/train-wreck-steam-locomotive-locomotive-railway-73821-2.jpeg.jpg" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="1138" data-original-width="1600" height="452" src="https://2.bp.blogspot.com/-OmKNB3u1cqI/WXRd0oTZlRI/AAAAAAAAOUU/VJsQ-dgXPqofiG5naA0syjU1isMw--0RwCLcBGAs/s640/train-wreck-steam-locomotive-locomotive-railway-73821-2.jpeg.jpg" width="640" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;"><i>Image credit: pexels.com</i></td></tr></tbody></table><div style="text-align: left;"><h2 style="text-align: left;"><b>Waste Not, Vacate Not.</b></h2></div></h2><div><span style="color: black; float: left; font-family: &quot;times&quot; , serif , &quot;georgia&quot;; font-size: 48px; line-height: 30px; padding-right: 2px; padding-top: 2px;">W</span><br />hen <b>Jeff Bezos</b>, founder and CEO of Amazon, started out Amazon, he, along with Shel Kaphan, programmer and a founding employee, used sixty-dollar doors from Home Depot as desks. It was the demand of frugality. 
More than a decade later, when Amazon was a multi-billion dollar behemoth, conference-room tables were <i>still</i> made of door-desks. It reflected its CEO's adamant belief in "<i>frugality</i>." A leadership principle at Amazon states that "<i><b>Frugality breeds resourcefulness, self-sufficiency and invention.</b></i>" Unless you have been living in a world without news, you know that Amazon's market capitalization, as of July 23rd, was a shade under US$500 billion, with trailing twelve-month revenues in excess of US$140 billion, growing at an annual rate of more than 20%.<br /><br />All this about Amazon's culture of frugality is captured in Brad Stone's brilliant book on the company, "<b>The Everything Store:&nbsp;Jeff Bezos and the Age of Amazon.</b>"<br /><blockquote class="tr_bq">"Bezos met me in an eighth-floor conference room and we sat down at a large table made of half a dozen door-desks, the same kind of blond wood that Bezos used twenty years ago when he was building Amazon from scratch in his garage. The door-desks are often held up as a symbol of the company’s enduring frugality."<br />...<br />They set up shop in the converted garage of Bezos’s house, an enclosed space without insulation and with a large, black potbellied stove at its center. Bezos built the first two desks out of sixty-dollar blond-wood doors from Home Depot, an endeavor that later carried almost biblical significance at Amazon, like Noah building the ark.<br />...<br />"Door-Desk award, given to an employee who came up with “a well-built idea that helps us to deliver lower prices to customers”—the prize was a door-desk ornament. Bezos was once again looking for ways to reinforce his values within the company."<br />...<br />"Conference-room tables are a collection of blond-wood door-desks shoved together side by side. The vending machines take credit cards, and food in the company cafeterias is not subsidized. 
When a new hire joins the company, he gets a backpack with a power adapter, a laptop dock, and some orientation materials. When someone resigns, he is asked to hand in all that equipment—including the backpack." [The Everything Store, by Brad Stone]</blockquote><h4 style="text-align: left;">So what does this have to do with Flipkart?</h4>Flipkart has been in business for (almost) ten years now (it was founded in October 2007). It has raised more than <a href="https://en.wikipedia.org/wiki/Flipkart">$4 billion dollars from investors</a>, the most recent round of funding closing in early 2017. The Indian e-commerce pioneer however has yet to make a single new paisa in profit. In its fiscal year ending March 31st, 2016, its <a href="http://www.business-standard.com/article/economy-policy/flipkart-s-losses-double-to-rs-2-306-crore-in-fy16-on-hand-of-amazon-116112800242_1.html">losses doubled to ₹2,306 crores</a>&nbsp;(approximately US$350 million). Keep that in mind as you go through this post.<br /><br />In <b>October 2014</b>, coming off the back of two funding rounds that saw it raise more than $1 billion from investors, came news that <a href="http://profit.ndtv.com/news/corporates/article-flipkart-leases-office-space-in-bangalore-from-realty-firm-embassy-682014">Flipkart had entered into an agreement to lease 3 million square feet</a> of prime office space for an estimated annual rent of ₹300 crores (approximately US$48 million at the then exchange rates). This figure was cut down to 2 million sq ft by the time the deal was&nbsp;<a href="http://economictimes.indiatimes.com/wealth/personal-finance-news/flipkart-leases-2-million-square-feet-for-20-years-office-space-single-largest-in-india/articleshow/47168691.cms">announced</a>&nbsp;in May 2015. 
Even with the reduced commitment, it was, at the time,&nbsp;<a href="http://economictimes.indiatimes.com/wealth/personal-finance-news/flipkart-leases-2-million-square-feet-for-20-years-office-space-single-largest-in-india/articleshow/47168691.cms">touted</a>&nbsp;as the "<b><i>single largest commitment of office space anywhere in the country</i>.</b>"<br /><br /></div>In late <b>2015</b>, several news sites, including the&nbsp;<a href="http://retail.economictimes.indiatimes.com/">Economic Times</a>, posted extensive photos of <b>Flipkart's&nbsp;<a href="http://retail.economictimes.indiatimes.com/slide-shows/flipkart-reveals-exclusive-pictures-of-its-new-office-in-bangalore/49012537">new office</a>&nbsp;at the Cessna Business Park in Bengaluru</b>. A cursory look at the office, as revealed by the photos, told a story of a no-expenses spared philosophy at work. Each floor had a "<i>theme inspired by human greatness in various fields – science, sports, fashion, music</i>". Hallways were designed to resemble running tracks, with the Olympic logo emblazoned prominently.<br /><div class="separator" style="clear: both; text-align: center;"><a href="https://4.bp.blogspot.com/-HnpzOn2EavU/WH0Soo6CWuI/AAAAAAAANtE/F0rJoyJaXMobJmwMb5CzAyfm6GCTRoY_wCLcB/s1600/5262.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="216" src="https://4.bp.blogspot.com/-HnpzOn2EavU/WH0Soo6CWuI/AAAAAAAANtE/F0rJoyJaXMobJmwMb5CzAyfm6GCTRoY_wCLcB/s320/5262.PNG" width="320" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://3.bp.blogspot.com/-B_QU3yVchmQ/WH0Sn49CV-I/AAAAAAAANs8/ruYkhXgz6J8to2rMYfsmgGoY8UKEm7btgCLcB/s1600/5263.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="260" src="https://3.bp.blogspot.com/-B_QU3yVchmQ/WH0Sn49CV-I/AAAAAAAANs8/ruYkhXgz6J8to2rMYfsmgGoY8UKEm7btgCLcB/s320/5263.PNG" width="320" /></a></div><br /><div class="separator" style="clear: both; 
text-align: center;"><a href="https://1.bp.blogspot.com/-XDnfc2VdHoM/WH0Sohe3aCI/AAAAAAAANtA/0GyPL29bZgQYgFG9Ts1EGDPeVQK17YSLQCLcB/s1600/5264.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="208" src="https://1.bp.blogspot.com/-XDnfc2VdHoM/WH0Sohe3aCI/AAAAAAAANtA/0GyPL29bZgQYgFG9Ts1EGDPeVQK17YSLQCLcB/s320/5264.PNG" width="320" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/-cuCywZpbKJc/WH0SrjtQszI/AAAAAAAANtI/HkzIitKg5MQYOe2Ywc25TZroDRTyehZtwCLcB/s1600/5265.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="193" src="https://1.bp.blogspot.com/-cuCywZpbKJc/WH0SrjtQszI/AAAAAAAANtI/HkzIitKg5MQYOe2Ywc25TZroDRTyehZtwCLcB/s320/5265.PNG" width="320" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://4.bp.blogspot.com/-EfRvvbL7J4o/WH0StbsMlpI/AAAAAAAANtM/E8q8BHpeP1cuaEK32OjASh79Rd9IZs91ACLcB/s1600/5266.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="209" src="https://4.bp.blogspot.com/-EfRvvbL7J4o/WH0StbsMlpI/AAAAAAAANtM/E8q8BHpeP1cuaEK32OjASh79Rd9IZs91ACLcB/s320/5266.PNG" width="320" /></a></div><br /><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-Go0gPowOGrA/WH0St-bsDpI/AAAAAAAANtQ/VHFZa8Yl_X8v1WE7pv_ltlf59TR2jPoGgCLcB/s1600/5267.PNG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="212" src="https://2.bp.blogspot.com/-Go0gPowOGrA/WH0St-bsDpI/AAAAAAAANtQ/VHFZa8Yl_X8v1WE7pv_ltlf59TR2jPoGgCLcB/s320/5267.PNG" width="320" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Images credit: Economic Times</td></tr></tbody></table>By 2016, Flipkart's numerous missteps had only compounded its woes in the face 
of an unrelenting foe in the form of a rampaging Amazon. In <b>November 2016</b>, therefore, the&nbsp;<a href="http://economictimes.indiatimes.com/wealth/real-estate/flipkart-goes-lean-on-office-space-as-part-of-austerity-drive/articleshow/55301067.cms">news</a>&nbsp;came as no surprise that <b>Flipkart had decided to forgo almost half of the office space it had signed up for a year ago</b>. Instead of the two-million square feet, the company wanted no more than 1.2 million sq-ft. In addition, it had negotiated lowered fitment costs from ₹2400 to ₹1500 per sq-ft.<br /><br /><b>Juxtapositions are meant to contrast. They can also be cruel.&nbsp;</b><br />Like when it was <a href="http://techcircle.vccircle.com/2017/01/23/amazon-leases-over-a-million-sq-ft-of-office-space-in-2016/">reported in Jan 2017</a> that Amazon had leased more than one million sq ft of office space in India in 2016. That Amazon had leased more office space in 2015 than it had in all its previous years of presence in India. That it was reported in June 2017 that Amazon had <a href="https://inc42.com/buzz/amazon-leases-office-space-hyderabad/">leased 600,000 sq ft of office space in Hyderabad</a>.<br /><br />On top of this juxtaposition, let's add a dash of irony. Both of Flipkart's founders, Sachin Bansal and Binny Bansal, had worked at Amazon before leaving to start Flipkart. Jeff Bezos' mantra of frugality had either never been learned, or had perhaps been buried under the billions of investor money.<br /><br />Since we are talking about contrasts, let me end with one more. In September 2016, <a href="http://www.rediff.com/business/report/tech-flipkart-laying-off-at-least-700-employees-to-cut-costs/20160729.htm">it was reported</a> that <a href="http://www.thehindubusinessline.com/companies/flipkart-to-sack-800-more-amidst-gloomy-biz-outlook/article9091342.ece">Flipkart was planning to cut its staff by 800</a>, on top of 400 "<i>performance-related</i>" exits in July. 
In May 2016, it communicated to India's premier management institutes - Indian Institutes of Management at Ahmedabad, Bangalore, Lucknow, and the Faculty of Management Studies, Delhi - that it would <a href="http://www.rediff.com/business/report/iims-passouts-on-sticky-wicket-as-flipkart-defers-recruitment/20160526.htm">defer the joining dates of students it had made job offers to by six months</a>. In <a href="http://www.rediff.com/business/report/iims-passouts-on-sticky-wicket-as-flipkart-defers-recruitment/20160526.htm">response</a>, "<i>The authorities at IIM-A have sent a strongly worded letter to Flipkart, marking other premier B-schools such as IIM-Bangalore, IIM-Lucknow and the Faculty of Management Studies, Delhi.</i>"<br /><br />&nbsp;What about Amazon? The company, in a <a href="http://www.businesswire.com/news/home/20170112005428/en/">press-release in January 2017</a>, announced that it&nbsp;would "<b><i>Create More Than 100,000 New, Full-Time, Full-Benefit Jobs across the U.S. over the Next 18 Months.</i></b>"<br /><br />What's the takeaway? That companies need to beware the <a href="http://www.businessinsider.com/poorly-timed-headquarters-2009-11?IR=T">curse of the new headquarters</a>? Or that founders need to focus on companies that can stand on their own feet? That CEOs need to focus on execution? That boards and investors cannot function as absentee landlords?<br /><br /><span style="background-color: white; font-family: &quot;times new roman&quot;; font-size: 15.4px;">[<i>I have written at length on this fascinating slugfest. 
When I read about and witnessed its mobile-only obsession I had called it a dangerous&nbsp;</i></span><i style="background-color: white; font-family: Georgia, Utopia, &quot;Palatino Linotype&quot;, Palatino, serif; font-size: 15.4px;"><a href="http://blog.abhinavagarwal.net/2015/04/is-flipkart-losing-focus-1.html" style="color: #073763; font-family: &quot;times new roman&quot;; text-decoration-line: none;">distraction</a><span style="font-family: &quot;times new roman&quot;;">, not to mention a&nbsp;</span><a href="http://blog.abhinavagarwal.net/2015/05/flipkart-and-focus-2-mobile-advertising.html" style="color: #073763; font-family: &quot;times new roman&quot;; text-decoration-line: none;">revenue chimera</a><span style="font-family: &quot;times new roman&quot;;">&nbsp;and a&nbsp;</span><a href="http://blog.abhinavagarwal.net/2015/05/flipkart-and-focus-3-theres-something.html" style="color: #073763; font-family: &quot;times new roman&quot;; text-decoration-line: none;">privacy</a><span style="font-family: &quot;times new roman&quot;;">&nbsp;nightmare. I warned that Flipkart was making a mistake, a big mistake, in&nbsp;</span><a href="http://blog.abhinavagarwal.net/2015/05/flipkart-and-focus-4-beware-whispering.html" style="color: #073763; font-family: &quot;times new roman&quot;; text-decoration-line: none;">taking its eye off the ball</a><span style="font-family: &quot;times new roman&quot;;">&nbsp;in competing against Amazon, using a cricket analogy that should have been familiar to the Indian founders. I wrote about how&nbsp;</span><a href="http://blog.abhinavagarwal.net/2016/04/flipkart-million-dollar-hiring-mistakes.html" style="color: #073763; font-family: &quot;times new roman&quot;; text-decoration-line: none;">hubris-driven million-dollar hires</a><span style="font-family: &quot;times new roman&quot;;">&nbsp;had resulted in billion dollar erosions in valuations. 
I wrote about what had become an</span><span style="font-family: &quot;times new roman&quot;;">&nbsp;</span><a href="http://blog.abhinavagarwal.net/2017/02/flipkart-and-revolving-door.html" style="color: #073763; font-family: &quot;times new roman&quot;; text-decoration-line: none;">ever-revolving door of executive exits</a><span style="font-family: &quot;times new roman&quot;;">&nbsp;</span></i><span style="background-color: white; font-family: &quot;times new roman&quot;; font-size: 15.4px;"><i>at Flipkart. I wrote about <a href="https://www.linkedin.com/pulse/flipkart-art-brand-management-escapades-abhinav-agarwal">brand management snafus at Flipkart</a>.</i>]</span><br /><div><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://2.bp.blogspot.com/-YUoAi7vNy_E/WH38R11J4ZI/AAAAAAAANto/TKRBNf4Vzkkf4NVMJOR92QHJTCajXb9yQCLcB/s1600/5269.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="236" src="https://2.bp.blogspot.com/-YUoAi7vNy_E/WH38R11J4ZI/AAAAAAAANto/TKRBNf4Vzkkf4NVMJOR92QHJTCajXb9yQCLcB/s320/5269.PNG" width="320" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://4.bp.blogspot.com/-1jsLG6vUhLQ/WH38RwQwxQI/AAAAAAAANtk/4FVCloSNo4ApsYI2UaVkjdifZ5NfKluRACLcB/s1600/5270.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="239" src="https://4.bp.blogspot.com/-1jsLG6vUhLQ/WH38RwQwxQI/AAAAAAAANtk/4FVCloSNo4ApsYI2UaVkjdifZ5NfKluRACLcB/s320/5270.PNG" width="320" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/-7WWq1vDsOD8/WH38R2GaMQI/AAAAAAAANtg/CmfzYPesgRwiQovNSoP7ngaipky2NVNoQCLcB/s1600/5271.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="241" src="https://1.bp.blogspot.com/-7WWq1vDsOD8/WH38R2GaMQI/AAAAAAAANtg/CmfzYPesgRwiQovNSoP7ngaipky2NVNoQCLcB/s320/5271.PNG" width="320" /></a></div><br /><div 
class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/-P7fsM7RKKnU/WH38TmFQv7I/AAAAAAAANt0/P5Qg1jMJMkAyQxBz4CfqEK3beRPEwirYwCLcB/s1600/5272.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="260" src="https://1.bp.blogspot.com/-P7fsM7RKKnU/WH38TmFQv7I/AAAAAAAANt0/P5Qg1jMJMkAyQxBz4CfqEK3beRPEwirYwCLcB/s320/5272.PNG" width="320" /></a></div><br /><div class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/-g0yvkAW1t0M/WH38Tgci1HI/AAAAAAAANts/430K8TvdnxMHUJumjAvFneEUukT7YrtagCLcB/s1600/5273.PNG" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="249" src="https://1.bp.blogspot.com/-g0yvkAW1t0M/WH38Tgci1HI/AAAAAAAANts/430K8TvdnxMHUJumjAvFneEUukT7YrtagCLcB/s320/5273.PNG" width="320" /></a></div><br /><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-Jzj0t8eqlKE/WH38Ti_PRiI/AAAAAAAANtw/JBO3SFvSwasgiZ2g5-aODAhuWObTJbcFACLcB/s1600/5274.PNG" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="239" src="https://2.bp.blogspot.com/-Jzj0t8eqlKE/WH38Ti_PRiI/AAAAAAAANtw/JBO3SFvSwasgiZ2g5-aODAhuWObTJbcFACLcB/s320/5274.PNG" width="320" /></a></td></tr><tr><td class="tr-caption" style="font-size: 12.8px;">Images credit:&nbsp;<a href="https://in.news.yahoo.com/photos/flipkart-s-snazzy-new-office-in-bangalore-1437970597-slideshow/flipkart-s-new-office-in-bangalore-photo-1437968481086.html" style="font-size: medium; text-align: left;">Yahoo</a></td></tr></tbody></table><i>This post first appeared in <a href="https://www.linkedin.com/today/author/abhinavagarwal">LinkedIn Pulse</a> on <a href="https://www.linkedin.com/pulse/one-mantra-startups-waste-vacate-abhinav-agarwal">July 23rd, 2017</a>.</i><br /><br /></div><div>"<b>The 
Everything Store:&nbsp;Jeff Bezos and the Age of Amazon</b>" (<a href="http://amzn.to/2eE8uh7">US</a>,&nbsp;<a href="http://amzn.to/2vLy5rR">IN</a>,&nbsp;<a href="https://www.amazon.com/Everything-Store-Jeff-Bezos-Amazon-ebook/dp/B00DJ3ITKS/tag=abhinav-20">Kindle US</a>,&nbsp;<a href="http://amzn.to/2vLxfvk">Kindle IN</a>)<br /><iframe frameborder="0" marginheight="0" marginwidth="0" scrolling="no" src="//ws-na.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&amp;OneJS=1&amp;Operation=GetAdHtml&amp;MarketPlace=US&amp;source=ss&amp;ref=as_ss_li_til&amp;ad_type=product_link&amp;tracking_id=abhinav-20&amp;marketplace=amazon&amp;region=US&amp;placement=0316219282&amp;asins=0316219282&amp;linkId=3d7a0dd36b8bb4d9c443ffce99e2a28a&amp;show_border=true&amp;link_opens_in_new_window=true" style="height: 240px; width: 120px;"></iframe> <iframe frameborder="0" marginheight="0" marginwidth="0" scrolling="no" src="//ws-na.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&amp;OneJS=1&amp;Operation=GetAdHtml&amp;MarketPlace=US&amp;source=ss&amp;ref=as_ss_li_til&amp;ad_type=product_link&amp;tracking_id=abhinav-20&amp;marketplace=amazon&amp;region=US&amp;placement=B00DJ3ITKS&amp;asins=B00DJ3ITKS&amp;linkId=3d7a0dd36b8bb4d9c443ffce99e2a28a&amp;show_border=true&amp;link_opens_in_new_window=true" style="height: 240px; width: 120px;"></iframe> <iframe frameborder="0" marginheight="0" marginwidth="0" scrolling="no" src="//ws-in.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&amp;OneJS=1&amp;Operation=GetAdHtml&amp;MarketPlace=IN&amp;source=ss&amp;ref=as_ss_li_til&amp;ad_type=product_link&amp;tracking_id=abhisblog-21&amp;marketplace=amazon&amp;region=IN&amp;placement=0552167835&amp;asins=0552167835&amp;linkId=7b60f888241df34416cbc31d6a38c2a4&amp;show_border=true&amp;link_opens_in_new_window=true" style="height: 240px; width: 120px;"></iframe> <iframe frameborder="0" marginheight="0" marginwidth="0" scrolling="no" 
src="//ws-in.amazon-adsystem.com/widgets/q?ServiceVersion=20070822&amp;OneJS=1&amp;Operation=GetAdHtml&amp;MarketPlace=IN&amp;source=ss&amp;ref=as_ss_li_til&amp;ad_type=product_link&amp;tracking_id=abhisblog-21&amp;marketplace=amazon&amp;region=IN&amp;placement=B00DJ3ITKS&amp;asins=B00DJ3ITKS&amp;linkId=f52cd71d3ad047925b7330adcd470fef&amp;show_border=true&amp;link_opens_in_new_window=true" style="height: 240px; width: 120px;"></iframe><br /><br /><iframe allowfullscreen="" frameborder="0" height="550" src="https://read.amazon.in/kp/card?asin=B00DJ3ITKS&amp;preview=inline&amp;linkCode=kpe&amp;ref_=cm_sw_r_kb_dp_Kg6CzbY224PRY&amp;tag=abhisblog-21" style="max-width: 100%;" type="text/html" width="336"></iframe><br /><br /></div><span style="color: #666666; font-size: x-small;">© 2017, Abhinav Agarwal. All rights reserved.</span></div> Abhinav Agarwal tag:blogger.com,1999:blog-13714584.post-9018861829240305220 Mon Jul 24 2017 16:28:00 GMT-0400 (EDT) Part 4b - (Article 22: Profiling) Why me? 
and how Oracle 12c saves the day http://www.oralytics.com/2017/07/part-4b-article-22-profiling-why-me-and.html <p>This is the fourth part of a series of blog posts on '<a href="http://www.oralytics.com/2017/06/how-eu-gdpr-will-affect-use-of-machine.html">How the EU GDPR will affect the use of Machine Learning</a>'</p> <p>In this blog post (Part 4b) I will examine some of the more technical aspects and how the in-database machine learning functions save the day!</p> <p>In most cases where machine learning has been used and/or deployed in your company to analyse, profile and predict customers, it is more than likely that some sort of black box machine learning has been used.</p> <p><img src="https://lh3.googleusercontent.com/-TdukjdRnRdI/WVZnSq10niI/AAAAAAAAMN8/BF9WUInuMr89WbmMcF9FXVvf5QrOgUyVwCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="110" height="110" /></p> <p>Typical black box machine learning will include using algorithms like Neural Networks, but, within the context of the EU GDPR requirements, these concerns extend to other algorithms such as SVMs, GLMs, etc. Additionally, most companies don't just use one algorithm to make a decision on a customer. Many algorithms and rules-based decision methods can be used together, using some sort of voting system, to determine if a customer is targeted in a certain way.</p> <p>Basically, none of these really support the requirements of the EU GDPR.</p> <p><img src="https://lh3.googleusercontent.com/-X8iAna_26w8/WVZrUtnMomI/AAAAAAAAMOI/OogoG8IplVQaTVGYf35FGxrLJ6WHwBm3QCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="307" height="173" /></p> <p>In most cases we need to go back to basics. Back to simpler approaches of machine learning for customer profiling and prediction. This means no more, for now, ensemble models, unless you can explain why a customer was selected. 
This means having to use simple algorithms like Decision Trees, at a push Naive Bayes, and using some well-defined rules-based methods. All of these approaches allow us to see and understand why a customer was selected and, as Article 22 requires, to explain why.</p> <p>But there is some hope. Some of the commercial machine learning vendors already have some prediction insights built into their software. Very few if any open source solutions have this capability.</p> <p>For example, Oracle introduced a new function called PREDICTION_DETAILS in Oracle 12.1c and this was expanded in Oracle 12.2c to cover all their in-database machine learning algorithms.</p> <p>The following is an example of using this function for an SVM model. When you examine the boxes in the following image you can see that a slightly different set of attributes and the values of these attributes are listed. Each box corresponds to a different customer. This means we can give an explanation of why a customer was selected. Oracle 12c saves the day.</p> <pre><br />select cust_id, <br /> prediction(clas_svm_1_27 using *) pred_value, <br /> prediction_probability(clas_svm_1_27 using *) pred_prob, <br /> prediction_details(clas_svm_1_27 using *) pred_details <br />from mining_data_apply_v;<br /></pre> <p><img src="https://lh3.googleusercontent.com/-IY9rph-yc7U/WVUFCGnIIjI/AAAAAAAAMNc/iy8G9Vzd7ZEDY7yooo7sYdn74TgMIpkxACHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="573" height="322" /></p> <p>If you have a look at other commercial machine learning solutions, you will find some give similar functionality or it will be available soon. Can we get the same level of detail from open source solutions? Not really, unless you are using Decision Trees and maybe Naive Bayes. 
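As an aside, the kind of per-customer explanation a decision tree can give in the open source world can be sketched as follows. This is purely illustrative (the post names no library; scikit-learn, the data, and the feature names are all assumptions), but it mirrors in spirit what PREDICTION_DETAILS returns: the attributes and thresholds that led to a particular customer's prediction.

```python
# Illustrative sketch: extracting a per-prediction explanation from an
# open-source decision tree. scikit-learn, the data, and the feature
# names ("age", "income") are assumptions, not from the post.
from sklearn.tree import DecisionTreeClassifier

X = [[25, 30000], [45, 80000], [35, 52000],
     [50, 95000], [23, 28000], [40, 61000]]
y = [0, 1, 0, 1, 0, 1]                      # hypothetical "targeted" flag
FEATURES = ["age", "income"]

clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

def explain(sample):
    """List the decision rules the tree applied to one customer."""
    path = clf.decision_path([sample]).indices   # node ids along the path
    leaf = clf.apply([sample])[0]
    rules = []
    for node in path:
        if node == leaf:                         # stop at the leaf
            break
        feat = clf.tree_.feature[node]
        thr = clf.tree_.threshold[node]
        op = "<=" if sample[feat] <= thr else ">"
        rules.append(f"{FEATURES[feat]} {op} {thr:.1f}")
    return rules

print(explain([30, 40000]))
```

Each returned rule names an attribute and the threshold the customer fell on, which is exactly the kind of detail Article 22 asks us to surface.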
This means that companies that have gone down the pure open source route for their machine learning may have to look at using alternative software, and may have to fork out some hard-earned dollars/euros to make sure that they are compliant with Article 22 of the EU GDPR.</p> <br><p>Click back to '<a href="http://www.oralytics.com/2017/06/how-eu-gdpr-will-affect-use-of-machine.html">How the EU GDPR will affect the use of Machine Learning - Part 1</a>' for links to all the blog posts in this series.</p> Brendan Tierney tag:blogger.com,1999:blog-4669933501315263808.post-8693889995945589392 Mon Jul 24 2017 11:19:00 GMT-0400 (EDT) A Birthday This Big Deserves an Entire Year of Celebrations – Happy Birthday ODTUG! http://www.odtug.com/p/bl/et/blogaid=739&source=1 A Birthday This Big Deserves an Entire Year of Celebrations – Happy Birthday ODTUG! If you think you know everything about ODTUG’s history – think again! We’ve picked the minds of many long-time ODTUGers and compiled this list of memorable ODTUG milestones. ODTUG http://www.odtug.com/p/bl/et/blogaid=739&source=1 Wed Jul 19 2017 16:12:34 GMT-0400 (EDT) Oracle Enterprise Performance Reporting Cloud Service (EPRCS) 101 http://blog.performancearchitects.com/wp/2017/07/19/oracle-enterprise-performance-reporting-cloud-service-eprcs-101/ <p>Author: Mike McLean, Performance Architects</p> <p>As organizations face more complex internal and external reporting requirements, it is no longer sufficient to just provide the “numbers;” organizations need the ability to provide narrative in order to support and explain financial results.  Companies require a framework whereby they can collaborate across multiple sources and locations.  
<a href="https://cloud.oracle.com/enterprise-performance-reporting-cloud">Oracle’s Enterprise Performance Reporting Cloud Service (EPRCS)</a> provides a solution.</p> <p>EPRCS provides three steps in the report creation process: “Author,” “Review,” and “Sign-Off.” Workflow (with start and end dates) and security can be assigned to each step.</p> <p>Content is created in the “Author” step.  EPRCS utilizes “doclets” to create content.  Doclets are individual components of a report that can be assigned to multiple contributors.  Security can be assigned to the doclet so that only the necessary individuals can access it.  Only one author can sign out a doclet at a time.  When a doclet is signed in, a new version is created.  Doclets are then grouped together to create a “Report Package.”</p> <p>The “Review” step allows users to review and edit the content of a report package.</p> <p>In the “Sign-Off” step, users can approve and publish the report package or they can request edits.  Once the report package is approved, no additional edits can be made.</p> <p>EPRCS contains a library where all artifacts of the application are stored.</p> <p>If you have any questions, please drop us a note at <a href="mailto:sales@performancearchitects.com">sales@performancearchitects.com</a>, and we’ll see what we can do to help.</p> Melanie Mathews http://blog.performancearchitects.com/wp/?p=2076 Wed Jul 19 2017 05:05:40 GMT-0400 (EDT) Part 4a - (Article 22: Profiling) Why me? and how Oracle 12c saves the day http://www.oralytics.com/2017/07/part-4a-article-22-profiling-why-me-and.html <p>This is the fourth part of a series of blog posts on '<a href="http://www.oralytics.com/2017/06/how-eu-gdpr-will-affect-use-of-machine.html">How the EU GDPR will affect the use of Machine Learning</a>'</p> <p>In this blog post (Part 4a) I will discuss the specific issues relating to the use of machine learning algorithms and models. 
In the next blog post (Part 4b) I will examine some of the more technical aspects and how the in-database machine learning functions save the day!</p> <p>The EU GDPR has some rules that will affect the use of machine learning models for predicting customers.</p> <p><img src="https://lh3.googleusercontent.com/-nGcqmAEmeuI/WVTRKjlyo7I/AAAAAAAAMM4/aluymsPmHTI5FS-OKfdxmLeo9EPgUF41ACHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="481" height="287" /></p> <p>As with all the other sections of the EU GDPR, the use of machine learning and profiling of individuals does not just affect organisations based within Europe, but all organisations around the globe who will be using these methods and associated data.</p> <p>Article 22 of the EU GDPR deals with the “Automated individual decision-making, including profiling” and effectively creates a “right to explanation”. This means that an individual is entitled to an explanation of the decisions made by automated decision making models or profiling that has resulted in a decision being made about them. These new regulations present many challenges for organisations and their teams of data scientists.</p> <p><img src="https://lh3.googleusercontent.com/-XQWOe_v1BNA/WVTRVtRBiVI/AAAAAAAAMM8/xZ9_3Jack6caBRcKE3JYlnrCyQfFDs70wCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="390" height="130" /></p> <p>To be able to give an explanation of the decision made by the machine learning models or by profiling requires that the underlying models and their associated algorithms can give details of the model processing and how the decision about the individual was reached. For most machine learning models and algorithms this is generally not possible. 
For a limited set of algorithms, for example with decision trees, this is possible, but with other algorithms such as support vector machines, some regression models, and in particular neural networks, the ability to give these explanations is not possible. Some of these can be considered black box modelling (for neural networks) and grey box modelling for the others. But these algorithms are in widespread use in many organisations and are core to their predictive analytics solutions. This presents many challenges for organisations as they will need to look at alternative algorithms that may not have the same degree of predictive accuracy. With the recent rise of deep learning using neural networks, it is extremely difficult to explain the multilayer neural net with various learned weights between each of the nodes at each layer.</p> <p><img src="https://lh3.googleusercontent.com/-yEO-OdMHCNc/WVTRhhtUKaI/AAAAAAAAMNA/g9M-Y503e0sEerS59iWMOYfL80pYJvuPwCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="360" height="150" /></p> <p>Ensemble machine learning methods like Random Forests are also a challenge. Although the underlying machine learning algorithm is explainable, the ensemble approach of Random Forest, and other similar methods, result from an aggregation, averaging or voting process. Additionally, scenarios where machine learning models are combined with multiple other models, along with rules-based solutions, and the predicted outcome is based on the aggregation or voting of all methods, may no longer be usable. 
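The averaging problem described above can be made concrete with a small sketch. scikit-learn is an assumption here (the post names no library), but it shows why a random forest's output resists explanation: the forest's probability for a customer is just the mean of many separate tree opinions, so no single human-readable rule accounts for the decision.

```python
# Sketch of the ensemble aggregation problem: a random forest's
# predicted probability is the average of its trees' predictions.
# scikit-learn and the synthetic data are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.RandomState(0)
X = rng.rand(200, 4)                       # synthetic customer data
y = (X[:, 0] + X[:, 1] > 1).astype(int)    # synthetic target

forest = RandomForestClassifier(n_estimators=25, random_state=0).fit(X, y)

sample = X[:1]
forest_prob = forest.predict_proba(sample)[0, 1]
tree_probs = [t.predict_proba(sample)[0, 1] for t in forest.estimators_]

# The ensemble's answer is the mean of 25 separate tree opinions,
# each of which followed a different set of rules.
assert abs(forest_prob - np.mean(tree_probs)) < 1e-6
```

Each individual tree could explain its own path, but the customer's actual score is the blend of all of them, which is precisely what Article 22 makes problematic.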
The ability to explain a predicted outcome using ensemble methods may not be possible and this will affect their continued use for predictive analytics.</p> <p><img src="https://lh3.googleusercontent.com/-HPDiY81jQMg/WVTSEId6YkI/AAAAAAAAMNI/5GGvIOe6lugQTZVXbGiY_V2z5-f17muvQCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="250" height="200" /></p> <p>In addition to the requirements of Article 22, Articles 13 and 14 state that a person has a right to meaningful information about the logic involved in profiling them.</p> <p>Over the past few years many of the commercially available machine learning solutions have been preparing for the changes required to meet the EU GDPR. Some vendors have been able to add in greater model explanation features as well as specific explanations for each of the individual predictions. Many other vendors are still working on adding the required level of explanations, and some of these may not be available in time for when the EU GDPR comes into force in May 2018. This will present many challenges for organisations around the world who will be using data gathered within the EU region.</p> <p>For machine learning based on open source languages and tools the EU GDPR presents a very different challenge. While a small number of these come with some simple explanations for some of the more basic machine learning algorithms, there seems to be little information available on what work is currently being done to update these languages and tools. The limiting factor with making the required updates in the open source community lies with there being no commercial push to do so. As a result of these limitations, many organisations may be forced into using commercial machine learning products, but for many other organisations the cost of doing so will be prohibitive.</p> <p>It is clear that the tasks of building machine learning models have become significantly more complex with the introduction of the new EU GDPR. 
This complexity applies to the selection of what data can be used, ensuring there is no inherent discrimination in the machine learning models and the ability of these models to give an explanation of how the predicted outcome was determined. Companies around the world need to address these issues, and in doing so may limit what software and algorithms can be used for customer profiling and predictive analytics. Although some of the commercially available machine learning languages and products can give the required insights, more product enhancements are required. Many challenges face the machine learning open source community, with many research groups only starting in recent months to look at how their languages, packages and tools can be enhanced to facilitate the requirements of the EU GDPR.</p> <br><p>Click back to '<a href="http://www.oralytics.com/2017/06/how-eu-gdpr-will-affect-use-of-machine.html">How the EU GDPR will affect the use of Machine Learning - Part 1</a>' for links to all the blog posts in this series.</p> Brendan Tierney tag:blogger.com,1999:blog-4669933501315263808.post-1592055788261240012 Mon Jul 17 2017 11:12:00 GMT-0400 (EDT) Analyzing Wimbledon Twitter Feeds in Real Time with Kafka, Presto and Oracle DVD v3 http://www.rittmanmead.com/blog/2017/07/analyzing-wimbledon-twitter-feeds-in-real-time-with-kafka-presto-and-oracle-dvd-v3/ <img src="http://www.rittmanmead.com/blog/content/images/2017/07/Presto2-3.gif" alt="Analyzing Wimbledon Twitter Feeds in Real Time with Kafka, Presto and Oracle DVD v3"><p>Last week there was Wimbledon; if you are a fan of Federer, Nadal or Djokovic then it was one of the events not to be missed. 
I deliberately excluded Andy Murray from the list above since he kicked out my favourite player: Dustin Brown.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/giphy-downsized-large.gif" alt="Analyzing Wimbledon Twitter Feeds in Real Time with Kafka, Presto and Oracle DVD v3"></p> <p>Two weeks ago I was at <a href="http://kscope17.com">Kscope17</a> and one of the common themes, which reflected where the industry is going, was the usage of <a href="http://kafka.apache.org">Kafka</a> as the central hub for all data pipelines. I won't go into detail on Kafka's specific role and how it accomplishes it; you can grab the idea from two slides taken from a recent presentation by <a href="https://speakerdeck.com/rmoff/real-time-data-integration-at-scale-with-kafka-connect-dublin-apache-kafka-meetup-04-jul-2017">Confluent</a>.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Kafka-1.png" alt="Analyzing Wimbledon Twitter Feeds in Real Time with Kafka, Presto and Oracle DVD v3"></p> <p>One of the key points of all Kafka-related discussions at Kscope was that Kafka is widely used to take data from providers and push it to specific data-stores (like HDFS) that are then queried by analytical tools. However the "parking to data-store" step can sometimes be omitted, with analytical tools querying Kafka directly for real-time analytics. </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/1sfgw5.jpg" alt="Analyzing Wimbledon Twitter Feeds in Real Time with Kafka, Presto and Oracle DVD v3"></p> <p>We wrote a blog post at the beginning of the year about doing it with <a href="https://www.rittmanmead.com/blog/2017/01/getting-started-with-spark-streaming-with-python-and-kafka/">Spark Streaming and Python</a>; however, that setup was more data-scientist oriented and didn't provide the simple ANSI SQL familiar to the beloved end-users.</p> <p>As usual, Oracle announced a new release during Kscope. 
This year it was <a href="http://www.oracle.com/technetwork/middleware/oracle-data-visualization/downloads/oracle-data-visualization-desktop-2938957.html">Oracle Data Visualization Desktop 12.2.3.0.0</a> with a bunch of new features covered in my previous <a href="https://www.rittmanmead.com/blog/2017/07/oracle-data-visualization-desktop-v3-new-features/">blog post</a>. <br> The enhancement, amongst others, that made my day was the support for JDBC and ODBC drivers. It opened up a whole bundle of opportunities to query tools not officially supported by DVD but that expose those types of connectors.</p> <p>One of the tools that fits in this category is <a href="https://prestodb.io">Presto</a>, a distributed query engine belonging to the same family as <a href="https://www.rittmanmead.com/blog/2017/04/sql-on-hadoop-impala-vs-drill/">Impala and Drill</a>, commonly referred to as sql-on-Hadoop. A big plus of this tool, compared to the other two mentioned above, is that it natively queries Kafka via a <a href="https://prestodb.io/docs/current/connector/kafka-tutorial.html">dedicated connector</a>.</p> <p>I then found a way of fitting two of the main Kscope17 topics, a new sql-on-Hadoop tool and one of my favourite sports (Tennis), in the same blog post: analysing real time Twitter Feeds with Kafka, Presto and Oracle DVD v3. Not a bad idea.... let's check if it works...</p> <h1 id="analysingtwitterfeeds">Analysing Twitter Feeds</h1> <p>Let's start from the actual fun: analysing the tweets! 
We can navigate to the <a href="https://www.oracle.com/goto/OAStore">Oracle Analytics Store</a> and download some interesting add-ins we'll use: the <strong>Auto Refresh</strong> plugin that enables the refresh of the DV project, the <strong>Heat Map</strong> and <strong>Circle Pack</strong> visualizations and the <strong>Term Frequency</strong> advanced analytics pack.</p> <p>Importing the plugin and new visualizations can be done directly in the console as explained in my <a href="https://www.rittmanmead.com/blog/2017/07/oracle-data-visualization-desktop-v3-new-features/">previous post</a>. In order to be able to use the advanced analytics function we need to unzip the related file and move the <code>.xml</code> file it contains into <code>%INSTALL_DIR%\OracleBI1\bifoundation\advanced_analytics\script_repository</code>. In the Advanced Analytics zip file there is also a <code>.dva</code> project that we can import into DVD (password <em>Admin123</em>) which gives us a hint on how to use the function.</p> <p>We can now build a DVD Project about the Wimbledon gentlemen's singles final containing:</p> <ul> <li>A table view showing the latest tweets</li> <li>A horizontal bar chart showing the number of tweets containing mentions of Federer, Cilic or both</li> <li>A circle view showing the most tweeted terms</li> <li>A heatmap showing tweet locations (only for tweets with an activated localization)</li> <li>A line chart showing the number of tweets over time</li> </ul> <p>The project is automatically refreshed using the auto-refresh plugin mentioned above. A quick view of the result is provided by the following image.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Presto2-2.gif" alt="Analyzing Wimbledon Twitter Feeds in Real Time with Kafka, Presto and Oracle DVD v3"></p> <p>So far all good and simple! Now it's time to go back and check how the data is collected and queried. 
Let's start from Step #1: pushing Twitter data to <strong>Kafka</strong>! </p> <h1 id="kafka">Kafka</h1> <p>We covered Kafka installation and setup in a previous <a href="https://www.rittmanmead.com/blog/2015/10/forays-into-kafka-01-logstash-transport-centralisation/">blog post</a>, so I'll not repeat that part here. <br> The only piece I want to mention, since it gave me trouble, is the <code>advertised.host.name</code> setting: it's a configuration line in <code>/opt/kafka*/config/server.properties</code> that tells Kafka which host it should advertise to clients. </p> <p>If you leave the default <code>localhost</code> and try to push content to a topic from an external machine it will not show up, so as a prerequisite change it to a hostname/IP that can be resolved externally.</p> <p>The rest of the Kafka setup is the creation of a Twitter producer. I took <a href="https://dzone.com/articles/how-to-write-a-kafka-producer-using-twitter-stream">this Java project</a> as an example and changed it to use the latest Kafka release available in <a href="https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients/0.11.0.0">Maven</a>. It allowed me to create a Kafka topic named <code>rm.wimbledon</code> storing tweets containing the word <code>Wimbledon</code>.</p> <p>The same output could be achieved using <a href="https://www.confluent.io/product/connectors/">Kafka Connect</a> and its <a href="https://github.com/jcustenborder/kafka-connect-twitter">sink and source for Twitter</a>. Kafka Connect also has the benefit of being able to transform the data before landing it in Kafka, making the parsing easier and the stored data faster to retrieve. 
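</p>
<p>To make the producer side concrete, here is a minimal Python sketch of the filter-and-serialise step such a producer performs. This is an illustration, not the Java project's actual code: the tweet structure is reduced to a few fields, and the actual send is shown as a comment since it needs a running broker and a Kafka client library (assumptions here).</p>

```python
import json

TOPIC = "rm.wimbledon"

def to_kafka_payload(tweet):
    """Serialise a tweet for the rm.wimbledon topic; skip non-Wimbledon tweets."""
    if "Wimbledon" not in tweet.get("text", ""):
        return None  # only tweets mentioning Wimbledon are published
    return json.dumps(tweet).encode("utf-8")

# With a broker and a client library (e.g. the assumed kafka-python package):
# from kafka import KafkaProducer
# producer = KafkaProducer(bootstrap_servers="linuxsrv.local.com:9092")
# producer.send(TOPIC, payload)

sample = {"id": 885444381767081984, "text": "RT Wimbledon semifinals!",
          "user": {"name": "pietre"}}
payload = to_kafka_payload(sample)
```

<p>Whatever the implementation, the important property is the same as in the Java producer: messages land on the topic as raw JSON strings, which is what we will query later.</p>
<p>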
I'll cover the usage of Kafka Connect in a future post; for more information about it, check <a href="https://speakerdeck.com/rmoff/real-time-data-integration-at-scale-with-kafka-connect-dublin-apache-kafka-meetup-04-jul-2017">this</a> presentation from Robin Moffatt of Confluent.</p> <p>One final note about Kafka: I ran a command to limit the retention to a few minutes</p> <pre><code>bin/kafka-topics.sh --zookeeper localhost:2181 --alter --topic rm.wimbledon --config retention.ms=300000
</code></pre> <p>This limits the amount of data that is kept in Kafka, providing better performance at query time. This is not always possible in Kafka due to data collection needs, and there are other ways of optimizing the query if necessary. </p> <p>At this point in our project we have a dataflow from Twitter to Kafka, but no known way of querying it with DVD. It's time to introduce the query engine: <strong>Presto</strong>!</p> <h1 id="presto">Presto</h1> <p>Presto, developed at Facebook, is in the family of sql-on-Hadoop tools. However, like <a href="https://www.rittmanmead.com/blog/2017/04/sql-on-hadoop-impala-vs-drill/">Apache Drill</a>, it could be called <strong>sql-on-everything</strong> since the data doesn't need to reside on a Hadoop system. Presto can query local file systems, MongoDB, Hive, and a big variety of <a href="https://prestodb.io/docs/current/connector.html">datasources</a>.</p> <p>Like the other sql-on-Hadoop technologies, it works with always-on daemons, avoiding the startup latency Hive incurs when launching a MapReduce job. Presto, differently from the others, divides the daemons into two types: the Coordinator and the Worker. A <strong>Coordinator</strong> is a node that receives queries from clients; it analyses and plans the execution, which is then passed on to <strong>Workers</strong> to carry out. 
</p> <p>In other tools like <a href="https://www.rittmanmead.com/blog/2017/04/sql-on-hadoop-impala-vs-drill/">Impala and Drill</a> every node by default can act as both worker and coordinator. The same can also happen in Presto, but it is not the default: the documentation suggests dedicating a single machine to coordination tasks only, for best performance in large clusters (see the <a href="https://prestodb.io/docs/current/installation/deployment.html">doc</a>).</p> <p>The following image, taken from the Presto website, explains the flow when the Hive metastore is used as the datasource.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/presto-overview.png" alt="Analyzing Wimbledon Twitter Feeds in Real Time with Kafka, Presto and Oracle DVD v3"></p> <h2 id="installation">Installation</h2> <p>The default Presto installation procedure is pretty simple and can be found in the <a href="https://prestodb.io/docs/current/installation/deployment.html">official documentation</a>. 
We just need to download the <code>presto-server-0.180.tar.gz</code> tarball and unpack it.</p> <pre><code>tar -xvf presto-server-0.180.tar.gz
</code></pre> <p>This creates a folder named <code>presto-server-0.180</code> which is the <em>installation</em> directory; the next step is to create a subfolder named <code>etc</code> which contains the configuration settings.</p> <p>Then we need to create four configuration files and a folder within the <code>etc</code> folder:</p> <ul> <li><code>node.properties</code>: configuration specific to each node, enables the configuration of a cluster</li> <li><code>jvm.config</code>: options for the Java Virtual Machine</li> <li><code>config.properties</code>: specific coordinator/worker settings</li> <li><code>log.properties</code>: specifies log levels</li> <li><code>catalog</code>: a folder that will contain the data source definitions</li> </ul> <p>For the basic functionality we need the following configurations:</p> <h3 id="nodeenvironment">node.properties</h3> <pre><code>node.environment=production
node.id=ffffffff-ffff-ffff-ffff-ffffffffffff
node.data-dir=/var/presto/data
</code></pre> <p>The <code>environment</code> parameter is shared across all the nodes in the cluster, the <code>id</code> is a unique identifier of the node, and the <code>data-dir</code> is the location where Presto will store logs and data.</p> <h3 id="jvmconfig">jvm.config</h3> <pre><code>-server
-Xmx4G
-XX:+UseG1GC
-XX:G1HeapRegionSize=32M
-XX:+UseGCOverheadLimit
-XX:+ExplicitGCInvokesConcurrent
-XX:+HeapDumpOnOutOfMemoryError
-XX:+ExitOnOutOfMemoryError
</code></pre> <p>I reduced the <code>-Xmx</code> parameter to 4GB as I'm running in a test VM. 
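</p>
<p>The skeleton built so far can be scripted end to end; here is a hedged shell sketch that reproduces the layout in a throwaway directory (file contents copied from the snippets above, so nothing touches a real install):</p>

```shell
# Build the Presto etc/ skeleton described above in a scratch directory
INSTALL="$(mktemp -d)/presto-server-0.180"
mkdir -p "$INSTALL/etc/catalog"

# Per-node settings (the official docs name this file node.properties)
cat > "$INSTALL/etc/node.properties" <<'EOF'
node.environment=production
node.id=ffffffff-ffff-ffff-ffff-ffffffffffff
node.data-dir=/var/presto/data
EOF

# JVM flags, one per line
cat > "$INSTALL/etc/jvm.config" <<'EOF'
-server
-Xmx4G
-XX:+UseG1GC
-XX:G1HeapRegionSize=32M
EOF

ls "$INSTALL/etc"
```

<p>The remaining files (<code>config.properties</code>, <code>log.properties</code> and the catalog definitions) drop into the same <code>etc</code> folder.</p>
<p>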
The parameters can of course be changed as needed.</p> <h3 id="configproperties">config.properties</h3> <p>Since we want to keep it simple we'll create a single node acting as both coordinator and worker; the related config file is:</p> <pre><code>coordinator=true
node-scheduler.include-coordinator=true
http-server.http.port=8080
query.max-memory=5GB
query.max-memory-per-node=1GB
discovery-server.enabled=true
discovery.uri=http://linuxsrv.local.com:8080
</code></pre> <p>Here <code>coordinator=true</code> tells Presto to function as coordinator, <code>http-server.http.port</code> defines the HTTP port, and <code>discovery.uri</code> is the URI of the Discovery server (in this case the same process).</p> <h3 id="logproperties">log.properties</h3> <pre><code>com.facebook.presto=INFO
</code></pre> <p>We can keep the default <code>INFO</code> level; other levels are <code>DEBUG</code>, <code>WARN</code> and <code>ERROR</code>.</p> <h3 id="catalog">catalog</h3> <p>The last step in the configuration is the datasource setting: we need to create a folder named <code>catalog</code> within <code>etc</code> and create a file for each connection we intend to use. </p> <p>For the purpose of this post we want to connect to the Kafka topic named <code>rm.wimbledon</code>. We need to create a file named <code>kafka.properties</code> within the <code>catalog</code> folder created above. 
The file contains the following lines:</p> <pre><code>connector.name=kafka
kafka.nodes=linuxsrv.local.com:9092
kafka.table-names=rm.wimbledon
kafka.hide-internal-columns=false
</code></pre> <p>where <code>kafka.nodes</code> points to the Kafka brokers and <code>kafka.table-names</code> defines the list of topics, delimited by a <code>,</code>.</p> <p>The last bit needed is to start the Presto server by executing</p> <pre><code>bin/launcher start
</code></pre> <p>We can append the <code>--verbose</code> parameter to debug the installation, with logs that can be found in the <code>var/log</code> folder.</p> <h2 id="prestocommandlineclient">Presto Command Line Client</h2> <p>In order to query Presto via the command line interface we just need to download the associated client (see the <a href="https://prestodb.io/docs/current/installation/cli.html">official doc</a>), which comes as a <code>presto-cli-0.180-executable.jar</code> file. We can rename the file to <code>presto</code> and make it executable.</p> <pre><code>mv presto-cli-0.180-executable.jar presto
chmod +x presto
</code></pre> <p>Then we can start the client by executing</p> <pre><code>./presto --server linuxsrv.local.com:8080 --catalog kafka --schema rm
</code></pre> <p>Remember that the client requires JDK 1.8; otherwise you will face an error. Once the client is successfully set up, we can start querying Kafka.</p> <p>You may notice that the schema (<code>rm</code>) we're connecting to is just the prefix of the <code>rm.wimbledon</code> topic used in Kafka. 
In this way I could potentially store other topics using the same <code>rm</code> prefix and query them all together.</p> <p>We can check which schemas can be used in Kafka with</p> <pre><code>presto:rm&gt; show schemas;
       Schema
--------------------
 information_schema
 rm
(2 rows)
</code></pre> <p>We can also check which topics are contained in the <code>rm</code> schema by executing</p> <pre><code>presto:rm&gt; show tables;
   Table
-----------
 wimbledon
(1 row)
</code></pre> <p>or change schema by executing</p> <pre><code>use information_schema;
</code></pre> <p>Going back to the Wimbledon example, we can describe the content of the topic by executing</p> <pre><code>presto:rm&gt; describe wimbledon;
      Column       |  Type   | Extra |                   Comment
-------------------+---------+-------+---------------------------------------------
 _partition_id     | bigint  |       | Partition Id
 _partition_offset | bigint  |       | Offset for the message within the partition
 _segment_start    | bigint  |       | Segment start offset
 _segment_end      | bigint  |       | Segment end offset
 _segment_count    | bigint  |       | Running message count per segment
 _key              | varchar |       | Key text
 _key_corrupt      | boolean |       | Key data is corrupt
 _key_length       | bigint  |       | Total number of key bytes
 _message          | varchar |       | Message text
 _message_corrupt  | boolean |       | Message data is corrupt
 _message_length   | bigint  |       | Total number of message bytes
(11 rows)
</code></pre> <p>We can immediately start querying it, like</p> <pre><code>presto:rm&gt; select count(*) from wimbledon;
 _col0
-------
 42295
(1 row)

Query 20170713_102300_00023_5achx, FINISHED, 1 node
Splits: 18 total, 18 done (100.00%)
0:00 [27 rows, 195KB] [157 rows/s, 1.11MB/s]
</code></pre> <p>Remember that all the queries go against Kafka in real time, so the more messages we push, the more results we'll have available. 
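</p>
<p>Since every message is a raw JSON tweet, the JSON-path lookups used in the queries below (Presto's <code>json_extract_scalar</code>) boil down to walking nested keys. A minimal Python sketch of the idea, with an invented payload; this mimics simple <code>$.a.b</code> paths only, not the full JSONPath syntax Presto supports:</p>

```python
import json

def extract_scalar(message, path):
    """Resolve a simple '$.a.b' style path inside a JSON string."""
    node = json.loads(message)
    for key in path.lstrip("$.").split("."):
        node = node[key]  # descend one level per path segment
    return node

# Invented payload shaped like the tweets stored in the topic
message = '{"id": 1, "text": "Wimbledon final!", "user": {"name": "pietre"}}'
name = extract_scalar(message, "$.user.name")  # "pietre"
```

<p>Presto does the equivalent walk server-side, once per row, which is worth keeping in mind when reading the query timings that follow.</p>
<p>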
Let's now check what the messages look like</p> <pre><code>presto:rm&gt; SELECT _message FROM wimbledon LIMIT 5;
-----------------------------------------------------------------------------------------------------------------------------------------------------------------
 {"created_at":"Thu Jul 13 10:22:46 +0000 2017","id":885444381767081984,"id_str":"885444381767081984","text":"RT @paganrunes: Ian McKellen e Maggie Smith a Wimbl
 {"created_at":"Thu Jul 13 10:22:46 +0000 2017","id":885444381913882626,"id_str":"885444381913882626","text":"@tomasberdych spricht vor dem @Wimbledon-Halbfinal
 {"created_at":"Thu Jul 13 10:22:47 +0000 2017","id":885444388645740548,"id_str":"885444388645740548","text":"RT @_JamieMac_: Sir Andrew Murray is NOT amused wit
 {"created_at":"Thu Jul 13 10:22:49 +0000 2017","id":885444394404503553,"id_str":"885444394404503553","text":"RT @IBM_UK_news: What does it take to be a #Wimbled
 {"created_at":"Thu Jul 13 10:22:50 +0000 2017","id":885444398929989632,"id_str":"885444398929989632","text":"RT @PakkaTollywood: Roger Federer Into Semifinals \
(5 rows)
</code></pre> <p>As expected, tweets are stored in JSON format. We can now use the <a href="https://prestodb.io/docs/current/functions/json.html">Presto JSON functions</a> to extract the relevant information from them. In the following we're extracting the <code>user.name</code> part of every tweet. Note the <code>LIMIT 10</code> (common among all the SQL-on-Hadoop technologies) to limit the number of rows returned.</p> <pre><code>presto:rm&gt; SELECT json_extract_scalar(_message, '$.user.name') FROM wimbledon LIMIT 10;
        _col0
---------------------
 pietre --
 BLICK Sport
 Neens
 Hugh Leonard
 ••••Teju KaLion••••
 Charlie Murray
 Alex
 The Daft Duck. 
 Hotstar
 Raj Singh Chandel
(10 rows)
</code></pre> <p>We can also create summaries like the top 10 users by number of tweets.</p> <pre><code>presto:rm&gt; SELECT json_extract_scalar(_message, '$.user.name') as screen_name,
        count(json_extract_scalar(_message, '$.id')) as nr
        FROM wimbledon
        GROUP BY json_extract_scalar(_message, '$.user.name')
        ORDER BY count(json_extract_scalar(_message, '$.id')) desc
        LIMIT 10;

     screen_name     | nr
---------------------+-----
 Evarie Balan        | 125
 The Master Mind     | 104
 Oracle Betting      |  98
 Nichole             |  85
 The K - Man         |  75
 Kaciekulasekran     |  73
 vientrainera        |  72
 Deporte Esp         |  66
 Lucas Mc Corquodale |  64
 Amal                |  60
(10 rows)
</code></pre> <h2 id="addingadescriptionfile">Adding a Description file</h2> <p>We saw above that it's possible to query with ANSI SQL statements using the Presto JSON functions. The next step is to define a structure on top of the data stored in the Kafka topic to turn the raw data into a table format. We can achieve this by writing a <strong>topic description file</strong>. The file must be in JSON format and stored under the <code>etc/kafka</code> folder; it is recommended, but not necessary, that the name of the file matches the Kafka topic (in our case <code>rm.wimbledon</code>). The file in our case would be the following</p> <pre><code>{
    "tableName": "wimbledon",
    "schemaName": "rm",
    "topicName": "rm.wimbledon",
    "key": {
        "dataFormat": "raw",
        "fields": [
            {
                "name": "kafka_key",
                "dataFormat": "LONG",
                "type": "BIGINT",
                "hidden": "false"
            }
        ]
    },
    "message": {
        "dataFormat": "json",
        "fields": [
            {
                "name": "created_at",
                "mapping": "created_at",
                "type": "TIMESTAMP",
                "dataFormat": "rfc2822"
            },
            {
                "name": "tweet_id",
                "mapping": "id",
                "type": "BIGINT"
            },
            {
                "name": "tweet_text",
                "mapping": "text",
                "type": "VARCHAR"
            },
            {
                "name": "user_id",
                "mapping": "user/id",
                "type": "VARCHAR"
            },
            {
                "name": "user_name",
                "mapping": "user/name",
                "type": "VARCHAR"
            },
            [...] 
        ]
    }
}
</code></pre> <p>After restarting Presto, executing the <code>DESCRIBE</code> operation shows all the available fields.</p> <pre><code>presto:rm&gt; describe wimbledon;
      Column       |   Type    | Extra |                   Comment
-------------------+-----------+-------+---------------------------------------------
 kafka_key         | bigint    |       |
 created_at        | timestamp |       |
 tweet_id          | bigint    |       |
 tweet_text        | varchar   |       |
 user_id           | varchar   |       |
 user_name         | varchar   |       |
 user_screenname   | varchar   |       |
 user_location     | varchar   |       |
 user_followers    | bigint    |       |
 user_time_zone    | varchar   |       |
 _partition_id     | bigint    |       | Partition Id
 _partition_offset | bigint    |       | Offset for the message within the partition
 _segment_start    | bigint    |       | Segment start offset
 _segment_end      | bigint    |       | Segment end offset
 _segment_count    | bigint    |       | Running message count per segment
 _key              | varchar   |       | Key text
 _key_corrupt      | boolean   |       | Key data is corrupt
 _key_length       | bigint    |       | Total number of key bytes
 _message          | varchar   |       | Message text
 _message_corrupt  | boolean   |       | Message data is corrupt
 _message_length   | bigint    |       | Total number of message bytes
(21 rows)
</code></pre> <p>Now I can use the newly defined columns in my query</p> <pre><code>presto:rm&gt; select created_at, user_name, tweet_text from wimbledon LIMIT 10;
</code></pre> <p>and the related results</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Table-Result.png" alt="Analyzing Wimbledon Twitter Feeds in Real Time with Kafka, Presto and Oracle DVD v3"></p> <p>We can always mix the defined columns with Presto's custom JSON parsing syntax if we need to extract some other fields.</p> <pre><code>select created_at, user_name, json_extract_scalar(_message, '$.user.default_profile') from wimbledon LIMIT 10;
</code></pre> <h1 id="oracledatavisualizationdesktop">Oracle Data Visualization Desktop</h1> <p>As mentioned at the beginning of the article, the overall goal was to analyse the Wimbledon Twitter feed in real time with Oracle Data Visualization Desktop 
via JDBC, so let's complete the picture!</p> <h2 id="jdbcdrivers">JDBC drivers</h2> <p>The first step is to download the Presto JDBC drivers, version 0.175; I found them on the <a href="https://mvnrepository.com/artifact/com.facebook.presto/presto-jdbc/0.175">Maven website</a>. I also tried the 0.180 version, downloadable directly from the <a href="https://prestodb.io/docs/current/installation/jdbc.html">Presto website</a>, but I had several connection errors. <br> After downloading, we need to copy the driver <code>presto-jdbc-0.175.jar</code> under the <code>%INSTALL_DIR%\lib</code> folder, where <code>%INSTALL_DIR%</code> is the Oracle DVD installation folder, and start DVD. Then we just need to create a new connection like the following</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Presto-Connection.png" alt="Analyzing Wimbledon Twitter Feeds in Real Time with Kafka, Presto and Oracle DVD v3"></p> <p>Note that:</p> <ul> <li><strong>URL</strong>: also includes the <code>/kafka</code> postfix; this tells Presto which storage I want to query</li> <li><strong>Driver Class Name</strong>: this setting puzzled me a little bit; I was able to discover the string (with the help of <a href="https://twitter.com/g_ceresa?lang=en">Gianni Ceresa</a>) by concatenating the folder name and the driver class name after unpacking the jar file</li> </ul> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Presto-Class.png" alt="Analyzing Wimbledon Twitter Feeds in Real Time with Kafka, Presto and Oracle DVD v3"></p> <p><strong>Username/password</strong>: those strings can be anything, since for this basic test we didn't set up any security on Presto.</p> <p>The whole JDBC setup process is described in this <a href="https://www.youtube.com/watch?v=-7UCqvN3P_A&amp;list=PLOcpw36tp3yIb3yobAE1mcxh1P8G3QYlD&amp;index=10">YouTube video</a> provided by Oracle. 
</p> <p>We can then define the source by just selecting the columns we want to import and creating a few additional ones, like <code>Lat</code> and <code>Long</code> parsed from the <code>coordinates</code> column, which is in the form <code>[Lat, Long]</code>. The dataset is now ready to be analysed as we saw at the beginning of the article, with the final result being:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Presto2-2.gif" alt="Analyzing Wimbledon Twitter Feeds in Real Time with Kafka, Presto and Oracle DVD v3"></p> <h1 id="conclusions">Conclusions</h1> <p>As we can see from the above picture, the whole process works (phew...). However, it has some limitations: there is no pushdown of functions to the source, so most of the queries we see against Presto are in the form of</p> <pre><code>select tweet_text, tweet_id, user_name, created_at from (
    select coordinates,
           coordinates_lat_long,
           created_at,
           tweet_id,
           tweet_text,
           user_followers,
           user_id,
           user_location,
           user_name,
           user_screenname,
           user_time_zone
    from rm.wimbledon)
</code></pre> <p>This means that the whole dataset is retrieved every time, making this solution far from optimal for big volumes of data. In those cases an intermediate "parking" step into a datastore would probably be necessary. Another limitation is related to the transformations: the <code>Lat</code> and <code>Long</code> extractions from the <code>coordinates</code> field, along with other column transformations, are done directly in DVD, meaning that the formula is applied directly in the visualization phase. In the second post we'll see how the source parsing phase and query performance can be enhanced using <a href="https://www.confluent.io/product/connectors/">Kafka Connect</a>, the framework allowing an easy integration between Kafka and other sources or sinks. </p> <p>One last word: winning Wimbledon eight times, fourteen years after the first victory and five years after the last one, is something impressive! 
Chapeau, Mr Federer!</p> Francesco Tisiot ff07563b-90cf-4b3b-b59d-2e2331fd71bb Mon Jul 17 2017 10:09:40 GMT-0400 (EDT) Understanding Data Visualizations: Box Plots https://realtrigeek.com/2017/07/14/understanding-data-visualizations-box-plots/ <p>If you work with lots of points of data that have similar attributes, but the values are all over the place, a box plot could be your best friend. What is a box plot? A box plot is a data visualization that shows you the distribution of your data for given attributes. A box plot is sometimes called a box and whiskers plot because the min and max look like whiskers attached to the body, which represents the middle 50% of the data. If you have lots of data, you probably want to quickly know how the data looks as a whole, and this is the perfect visualization for that task. Let me explain…</p> <p>Below we have a box plot (thank you netuitive.com for the picture). For a given set of numeric data, you will have a minimum and maximum value. These are represented by the ends of the whiskers. The actual box represents the middle 50% of the data, with the median value of the data represented by the line in the box. The span between the minimum and the edge of the box represents the lower quartile of the data, and the span from the max to the edge of the box represents the upper quartile. Any values plotted beyond the whiskers are considered outliers.</p> <p>A lot of information in one graphic, huh? 
That’s why it’s one of my favorites!</p> <p><img data-attachment-id="1841" data-permalink="https://realtrigeek.com/2017/07/14/understanding-data-visualizations-box-plots/1-11/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/11.jpg?w=840" data-orig-size="554,217" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="1" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/11.jpg?w=840?w=300" data-large-file="https://epmqueen.files.wordpress.com/2017/07/11.jpg?w=840?w=554" class=" size-full wp-image-1841 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/11.jpg?w=840" alt="1" srcset="https://epmqueen.files.wordpress.com/2017/07/11.jpg 554w, https://epmqueen.files.wordpress.com/2017/07/11.jpg?w=150 150w, https://epmqueen.files.wordpress.com/2017/07/11.jpg?w=300 300w" sizes="(max-width: 554px) 100vw, 554px" /></p> <p>Let’s take an example I came up with for Data Visualization. I downloaded some data around running events for the Summer Olympics. I was curious as to what the marathon times look like comparing women and men. 
Here is the first box plot comparing the data:</p> <p><img data-attachment-id="1842" data-permalink="https://realtrigeek.com/2017/07/14/understanding-data-visualizations-box-plots/2-13/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/21.jpg?w=840" data-orig-size="1709,851" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="2" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/21.jpg?w=840?w=300" data-large-file="https://epmqueen.files.wordpress.com/2017/07/21.jpg?w=840?w=840" class=" size-full wp-image-1842 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/21.jpg?w=840" alt="2" srcset="https://epmqueen.files.wordpress.com/2017/07/21.jpg?w=840 840w, https://epmqueen.files.wordpress.com/2017/07/21.jpg?w=1680 1680w, https://epmqueen.files.wordpress.com/2017/07/21.jpg?w=150 150w, https://epmqueen.files.wordpress.com/2017/07/21.jpg?w=300 300w, https://epmqueen.files.wordpress.com/2017/07/21.jpg?w=768 768w, https://epmqueen.files.wordpress.com/2017/07/21.jpg?w=1024 1024w" sizes="(max-width: 840px) 100vw, 840px" /></p> <p>We can see that there is a bit wider range in the last quartile of marathon times for women than men, but overall it seems that the times are about the same distribution. Recall that the dots are outliers in the data over the years.</p> <p>I was curious as to specific years, broken down by gender. Note that women did not have the option to run the marathon at the Olympics until 1984, so there is no data for them in years preceding 1984. 
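</p>
<p>For readers who want to connect the picture to the arithmetic, the five numbers behind each box (and the usual outlier rule) are easy to compute by hand. A minimal Python sketch with invented finishing times in minutes; note that quartile conventions vary slightly between tools, so a charting tool may draw marginally different boxes:</p>

```python
import statistics

def five_number_summary(values):
    """Min, Q1, median, Q3, max, plus outliers under Tukey's 1.5*IQR rule."""
    s = sorted(values)
    q1, med, q3 = statistics.quantiles(s, n=4, method="inclusive")
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr  # the "fences" beyond the whiskers
    return {
        "min": s[0], "q1": q1, "median": med, "q3": q3, "max": s[-1],
        "outliers": [v for v in s if v < lo or v > hi],
    }

# Invented marathon finishing times in minutes
times = [129, 131, 132, 133, 135, 136, 138, 140, 142, 170]
summary = five_number_summary(times)  # the 170-minute run falls past the fence
```

<p>Here the box would span Q1 to Q3 with the median line inside, and the 170-minute run would be drawn as an outlier dot, exactly the anatomy described above.</p>
<p>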
If we take a look at the men first, we can see yearly breakdowns. (I should note here that the data file only contained the gold, silver, and bronze medalists, so not all times are included. It also changes the visualization a bit because there should really be more than 3 data recordings for this graph to “work”, but I thought the data was different and interesting.)</p> <p><img data-attachment-id="1843" data-permalink="https://realtrigeek.com/2017/07/14/understanding-data-visualizations-box-plots/3-12/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/31.jpg?w=840" data-orig-size="1696,837" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="3" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/31.jpg?w=840?w=300" data-large-file="https://epmqueen.files.wordpress.com/2017/07/31.jpg?w=840?w=840" class=" size-full wp-image-1843 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/31.jpg?w=840" alt="3" srcset="https://epmqueen.files.wordpress.com/2017/07/31.jpg?w=840 840w, https://epmqueen.files.wordpress.com/2017/07/31.jpg?w=1680 1680w, https://epmqueen.files.wordpress.com/2017/07/31.jpg?w=150 150w, https://epmqueen.files.wordpress.com/2017/07/31.jpg?w=300 300w, https://epmqueen.files.wordpress.com/2017/07/31.jpg?w=768 768w, https://epmqueen.files.wordpress.com/2017/07/31.jpg?w=1024 1024w" sizes="(max-width: 840px) 100vw, 840px" /></p> <p>We can see in some years, the finish times were much more spread out than others. 
Also, that over time, the marathon times have gotten faster and the top 3 finishers cross the line closer to each other. It is odd that although the general trend is down for finish times, there are a couple spikes for data. I was curious if this had to do with temperature and/or elevation at all, so I added both to my marathon data (thank you, Google), and plotted it along the bottom in years. (Note that I could not get temperature data all the way back, so I started where I had data to work with in the set. Also, to get both temperature and elevation on the same graph, I enabled a 2<sup>nd</sup> Y axis for the elevation data.)</p> <p>Originally I had seen that 1968 was a strange year in that it had quite a jump. We can see below that that spike in finish times is likely due to the elevation – 7400’!</p> <p><img data-attachment-id="1844" data-permalink="https://realtrigeek.com/2017/07/14/understanding-data-visualizations-box-plots/4-12/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/41.jpg?w=840" data-orig-size="1712,848" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="4" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/41.jpg?w=840?w=300" data-large-file="https://epmqueen.files.wordpress.com/2017/07/41.jpg?w=840?w=840" class=" size-full wp-image-1844 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/41.jpg?w=840" alt="4" srcset="https://epmqueen.files.wordpress.com/2017/07/41.jpg?w=840 840w, https://epmqueen.files.wordpress.com/2017/07/41.jpg?w=1680 1680w, 
https://epmqueen.files.wordpress.com/2017/07/41.jpg?w=150 150w, https://epmqueen.files.wordpress.com/2017/07/41.jpg?w=300 300w, https://epmqueen.files.wordpress.com/2017/07/41.jpg?w=768 768w, https://epmqueen.files.wordpress.com/2017/07/41.jpg?w=1024 1024w" sizes="(max-width: 840px) 100vw, 840px" /></p> <p>With women filtered out for the years they ran the marathon, we can see there is a spike for 1992, but our available data does not show a possible reason. (Of course, I researched this… I saw that the marathon started at 6:30 PM in 1992. As a marathoner myself, I can’t imagine trying to stay rested and relaxed, planning how to eat during the day, and starting a race in the EVENING. Most of our races start before 7 AM!)</p> <p><img data-attachment-id="1845" data-permalink="https://realtrigeek.com/2017/07/14/understanding-data-visualizations-box-plots/5-13/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/51.jpg?w=840" data-orig-size="1709,849" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="5" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/51.jpg?w=840?w=300" data-large-file="https://epmqueen.files.wordpress.com/2017/07/51.jpg?w=840?w=840" class=" size-full wp-image-1845 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/51.jpg?w=840" alt="5" srcset="https://epmqueen.files.wordpress.com/2017/07/51.jpg?w=840 840w, https://epmqueen.files.wordpress.com/2017/07/51.jpg?w=1680 1680w, https://epmqueen.files.wordpress.com/2017/07/51.jpg?w=150 150w, https://epmqueen.files.wordpress.com/2017/07/51.jpg?w=300 
300w, https://epmqueen.files.wordpress.com/2017/07/51.jpg?w=768 768w, https://epmqueen.files.wordpress.com/2017/07/51.jpg?w=1024 1024w" sizes="(max-width: 840px) 100vw, 840px" /></p> <p>This has nothing to do with box plots, but because I think maps are neat, I wanted to plot the temperature for each Olympics and the average time for the marathon. I split it up by gender to see the results. Visually, I found it interesting that the men&#8217;s time in 1992 was faster than the women&#8217;s.</p> <p><img data-attachment-id="1846" data-permalink="https://realtrigeek.com/2017/07/14/understanding-data-visualizations-box-plots/6-12/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/61.jpg?w=840" data-orig-size="1385,850" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="6" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/61.jpg?w=840?w=300" data-large-file="https://epmqueen.files.wordpress.com/2017/07/61.jpg?w=840?w=840" class=" size-full wp-image-1846 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/61.jpg?w=840" alt="6" srcset="https://epmqueen.files.wordpress.com/2017/07/61.jpg?w=840 840w, https://epmqueen.files.wordpress.com/2017/07/61.jpg?w=150 150w, https://epmqueen.files.wordpress.com/2017/07/61.jpg?w=300 300w, https://epmqueen.files.wordpress.com/2017/07/61.jpg?w=768 768w, https://epmqueen.files.wordpress.com/2017/07/61.jpg?w=1024 1024w, https://epmqueen.files.wordpress.com/2017/07/61.jpg 1385w" sizes="(max-width: 840px) 100vw, 840px" /></p> <p>So, to take one from Sheldon&#8217;s &#8220;Fun with Flags&#8221;, I hope you learned 
a little in my “Fun with Box Plots” today.</p> <p>PS – I have also included the data set if you want to see other track distance stats, too: <a title="OlympicTrackStats" href="https://epmqueen.files.wordpress.com/2017/07/olympictrackstats.xlsx">OlympicTrackStats </a></p> <p>&nbsp;</p><br /> <a rel="nofollow" href="http://feeds.wordpress.com/1.0/gocomments/epmqueen.wordpress.com/1839/"><img alt="" border="0" src="http://feeds.wordpress.com/1.0/comments/epmqueen.wordpress.com/1839/" /></a> <img alt="" border="0" src="https://pixel.wp.com/b.gif?host=realtrigeek.com&#038;blog=70089387&#038;post=1839&#038;subd=epmqueen&#038;ref=&#038;feed=1" width="1" height="1" /> Sarah Craynon Zumbrum http://realtrigeek.com/?p=1839 Fri Jul 14 2017 14:10:56 GMT-0400 (EDT) Understanding Data Visualizations: Candlesticks https://realtrigeek.com/2017/07/13/understanding-data-visualizations-candlesticks/ <p>There are some really neat new visualizations available on the <a href="http://www.oracle.com/webfolder/technetwork/OracleAnalyticStore/index.html">Oracle Analytics Store</a> (formerly the Oracle BI Public Store; note the new link as well) that I hope you are following if you are a Data Visualization user. …But it hit me a couple of weeks ago that there are some more complex visualizations that an end user may not understand. So, here I am with a new series explaining how different visualizations are used.</p> <p>The first visualization I am going to walk you through is the Candlestick chart. If you are a pricing or financial analyst (especially one working with daily price changes), this visualization should be on your go-to list for analysis. So, what is a candlestick chart? A candlestick chart excels at visualizing price movements over time. It’s a combination of a line chart and a bar chart that gives you four pieces of information: Open Price, Close Price, High, and Low. Although they look a lot like boxplots, they are not the same. 
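To make that distinction concrete, here is a small Python sketch (mine, not from the post, and the numbers are invented): a box plot’s five values are statistics *computed* from a whole sample, while a candlestick’s four values are individual *observed* prices.

```python
import statistics

# A box plot's five values are computed from a sample of observations.
finish_times = [148, 152, 155, 157, 160, 164, 171, 180, 195]  # invented minutes
q1, median, q3 = statistics.quantiles(finish_times, n=4)
box = {"min": min(finish_times), "q1": q1, "median": median,
       "q3": q3, "max": max(finish_times)}

# A candlestick's four values are observed prices from a single trading day.
candle = {"open": 49.10, "high": 49.80, "low": 48.75, "close": 49.55}

print(box)     # five summary statistics describing a distribution
print(candle)  # four raw data points, no statistics involved
```

Change one finish time and the whole box can move; a candlestick only changes if one of the four recorded prices changes.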
Candlesticks are used to visualize price and currency patterns over time.</p> <p>Let’s look at an example…</p> <p>I’ve loaded 4 years of Oracle stock data that includes the daily open, close, high, low, and volume. If I use traditional visualizations, I get busy screens and compressed information that doesn’t tell the whole story very well.</p> <p>The first victim is the standard line chart. Line charts are great for comparing numerical values along a timeline. In the visualization below, I enabled a 2<sup>nd</sup> Y axis to allow the visualization to show volume on a larger scale than stock price. This tells us quite a bit, but not details of each day’s activity.</p> <p><img data-attachment-id="1824" data-permalink="https://realtrigeek.com/2017/07/13/understanding-data-visualizations-candlesticks/1-10/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/1.jpg?w=840" data-orig-size="1622,842" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="1" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/1.jpg?w=840?w=300" data-large-file="https://epmqueen.files.wordpress.com/2017/07/1.jpg?w=840?w=840" class=" size-full wp-image-1824 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/1.jpg?w=840" alt="1" srcset="https://epmqueen.files.wordpress.com/2017/07/1.jpg?w=840 840w, https://epmqueen.files.wordpress.com/2017/07/1.jpg?w=150 150w, https://epmqueen.files.wordpress.com/2017/07/1.jpg?w=300 300w, https://epmqueen.files.wordpress.com/2017/07/1.jpg?w=768 768w, https://epmqueen.files.wordpress.com/2017/07/1.jpg?w=1024 
1024w, https://epmqueen.files.wordpress.com/2017/07/1.jpg 1622w" sizes="(max-width: 840px) 100vw, 840px" /></p> <p>The second victim is the combo chart. While not too different from the line chart, sometimes showing area (or perhaps a bar chart) for one of the data elements tells a strong story. Again, I utilized a 2<sup>nd</sup> Y axis for volume. While still telling the full story, it’s compressed and hard to decipher. Now, I could add a filter to pare down to a certain date range, but the visualization is still too busy to make sound decisions.</p> <p><img data-attachment-id="1825" data-permalink="https://realtrigeek.com/2017/07/13/understanding-data-visualizations-candlesticks/2-12/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/2.jpg?w=840" data-orig-size="1610,837" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="2" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/2.jpg?w=840?w=300" data-large-file="https://epmqueen.files.wordpress.com/2017/07/2.jpg?w=840?w=840" class=" size-full wp-image-1825 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/2.jpg?w=840" alt="2" srcset="https://epmqueen.files.wordpress.com/2017/07/2.jpg?w=840 840w, https://epmqueen.files.wordpress.com/2017/07/2.jpg?w=150 150w, https://epmqueen.files.wordpress.com/2017/07/2.jpg?w=300 300w, https://epmqueen.files.wordpress.com/2017/07/2.jpg?w=768 768w, https://epmqueen.files.wordpress.com/2017/07/2.jpg?w=1024 1024w, https://epmqueen.files.wordpress.com/2017/07/2.jpg 1610w" sizes="(max-width: 840px) 100vw, 840px" /></p> <p>The third 
victim is a calendar heatmap. The heat map is great at showing cyclical trends visually. Below, we can see that the close price has “warmed up,” or increased, relative to some flatter periods. However, we cannot compare that to the stock open, close, high, or low for each day. We have no way to spot particular days of volatility, or to tell whether that volatility follows monthly, quarterly, or yearly cycles. Again, not the whole picture.</p> <p><img data-attachment-id="1826" data-permalink="https://realtrigeek.com/2017/07/13/understanding-data-visualizations-candlesticks/3-11/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/3.jpg?w=840" data-orig-size="1590,849" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="3" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/3.jpg?w=840?w=300" data-large-file="https://epmqueen.files.wordpress.com/2017/07/3.jpg?w=840?w=840" class=" size-full wp-image-1826 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/3.jpg?w=840" alt="3" srcset="https://epmqueen.files.wordpress.com/2017/07/3.jpg?w=840 840w, https://epmqueen.files.wordpress.com/2017/07/3.jpg?w=150 150w, https://epmqueen.files.wordpress.com/2017/07/3.jpg?w=300 300w, https://epmqueen.files.wordpress.com/2017/07/3.jpg?w=768 768w, https://epmqueen.files.wordpress.com/2017/07/3.jpg?w=1024 1024w, https://epmqueen.files.wordpress.com/2017/07/3.jpg 1590w" sizes="(max-width: 840px) 100vw, 840px" /></p> <p>Finally, we have the candlestick chart. The first thing we see is how clean the canvas is compared to the previous charts. 
Let’s dig into the pieces of the candlestick chart.</p> <p><img data-attachment-id="1827" data-permalink="https://realtrigeek.com/2017/07/13/understanding-data-visualizations-candlesticks/4-11/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/4.jpg?w=840" data-orig-size="1623,848" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="4" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/4.jpg?w=840?w=300" data-large-file="https://epmqueen.files.wordpress.com/2017/07/4.jpg?w=840?w=840" class=" size-full wp-image-1827 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/4.jpg?w=840" alt="4" srcset="https://epmqueen.files.wordpress.com/2017/07/4.jpg?w=840 840w, https://epmqueen.files.wordpress.com/2017/07/4.jpg?w=150 150w, https://epmqueen.files.wordpress.com/2017/07/4.jpg?w=300 300w, https://epmqueen.files.wordpress.com/2017/07/4.jpg?w=768 768w, https://epmqueen.files.wordpress.com/2017/07/4.jpg?w=1024 1024w, https://epmqueen.files.wordpress.com/2017/07/4.jpg 1623w" sizes="(max-width: 840px) 100vw, 840px" /></p> <p>Let’s break down the following slice of the screen:</p> <p><img data-attachment-id="1828" data-permalink="https://realtrigeek.com/2017/07/13/understanding-data-visualizations-candlesticks/5-12/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/5.jpg?w=840" data-orig-size="162,386" data-comments-opened="1" 
data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="5" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/5.jpg?w=840?w=126" data-large-file="https://epmqueen.files.wordpress.com/2017/07/5.jpg?w=840?w=162" class=" size-full wp-image-1828 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/5.jpg?w=840" alt="5" srcset="https://epmqueen.files.wordpress.com/2017/07/5.jpg 162w, https://epmqueen.files.wordpress.com/2017/07/5.jpg?w=63 63w" sizes="(max-width: 162px) 100vw, 162px" /></p> <ol> <li>Here’s where the candlestick label originates. The light gray bar represents the high and low prices of the day, showing the range of prices for that day. The top orange or dark gray bar represents the open and close prices. If the bar is orange, then the close price decreased from the previous day; dark gray means the close price increased. Clearly, some days showed more volatility than others.</li> <li>The second chart in the visualization shows the volume for the day directly below the candlestick. This is great for determining volume cycles over time if there are any.</li> <li>Shows the high price along with a calendar.</li> </ol> <p>Note a couple visualization pieces…</p> <p>See the box in the lower right-hand corner? 
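As an aside on item 1 in the breakdown above: the orange/dark-gray convention (orange when the close fell versus the previous day, dark gray when it rose) is easy to express in code. A minimal Python sketch, with invented closing prices:

```python
# Invented daily closing prices; each candle is colored by comparing its
# close with the previous day's close, as described in item 1 above.
closes = [49.55, 49.20, 49.90, 49.90, 50.30]

def candle_color(prev_close, close):
    """Orange = close fell from the previous day; dark gray = it rose."""
    if close < prev_close:
        return "orange"
    if close > prev_close:
        return "dark gray"
    return "unchanged"

# The first day has no previous close, so coloring starts on day two.
colors = [candle_color(prev, cur) for prev, cur in zip(closes, closes[1:])]
print(colors)
```

Note the comparison is close-to-previous-close; the candle body itself is drawn from that day’s own open and close.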
This is a sliding bar that you can use to increase or decrease the days shown on the canvas.</p> <p><img data-attachment-id="1829" data-permalink="https://realtrigeek.com/2017/07/13/understanding-data-visualizations-candlesticks/6-11/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/6.jpg?w=840" data-orig-size="230,206" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="6" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/6.jpg?w=840?w=230" data-large-file="https://epmqueen.files.wordpress.com/2017/07/6.jpg?w=840?w=230" class=" size-full wp-image-1829 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/6.jpg?w=840" alt="6" srcset="https://epmqueen.files.wordpress.com/2017/07/6.jpg 230w, https://epmqueen.files.wordpress.com/2017/07/6.jpg?w=150 150w" sizes="(max-width: 230px) 100vw, 230px" /></p> <p>In the upper right-hand corner, the visualization shows the full date range shown as well as the price change over that timeframe.</p> <p><img data-attachment-id="1830" data-permalink="https://realtrigeek.com/2017/07/13/understanding-data-visualizations-candlesticks/7-10/" data-orig-file="https://epmqueen.files.wordpress.com/2017/07/7.jpg?w=840" data-orig-size="287,81" data-comments-opened="1" 
data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="7" data-image-description="" data-medium-file="https://epmqueen.files.wordpress.com/2017/07/7.jpg?w=840?w=287" data-large-file="https://epmqueen.files.wordpress.com/2017/07/7.jpg?w=840?w=287" class=" size-full wp-image-1830 aligncenter" src="https://epmqueen.files.wordpress.com/2017/07/7.jpg?w=840" alt="7" srcset="https://epmqueen.files.wordpress.com/2017/07/7.jpg 287w, https://epmqueen.files.wordpress.com/2017/07/7.jpg?w=150 150w" sizes="(max-width: 287px) 100vw, 287px" /></p> <p>Now you understand how to read and use candlestick charts!</p><br /> <a rel="nofollow" href="http://feeds.wordpress.com/1.0/gocomments/epmqueen.wordpress.com/1819/"><img alt="" border="0" src="http://feeds.wordpress.com/1.0/comments/epmqueen.wordpress.com/1819/" /></a> <img alt="" border="0" src="https://pixel.wp.com/b.gif?host=realtrigeek.com&#038;blog=70089387&#038;post=1819&#038;subd=epmqueen&#038;ref=&#038;feed=1" width="1" height="1" /> Sarah Craynon Zumbrum http://realtrigeek.com/?p=1819 Thu Jul 13 2017 14:52:19 GMT-0400 (EDT) Using Smart View with an Oracle EPM (Hyperion) Planning or Planning and Budgeting Cloud Service (PBCS) Application http://blog.performancearchitects.com/wp/2017/07/12/using-smart-view-with-an-oracle-epm-hyperion-planning-or-planning-and-budgeting-cloud-service-pbcs-application/ <p>Author: Ben Hogle, Performance Architects<strong> </strong></p> <p><a href="http://www.oracle.com/technetwork/middleware/smart-view-for-office/overview/index.html">Oracle Smart View for Office</a> (Smart View) can be immensely powerful, convenient, and very useful 
for <a href="http://www.oracle.com/technetwork/middleware/planning/overview/index.html">Hyperion Planning</a> on-premise or <a href="https://cloud.oracle.com/en_US/planning-and-budgeting-cloud">Oracle Planning and Budgeting Cloud Service (PBCS)</a> users. Smart View allows end users to retrieve their planning data via an add-on to <a href="https://www.office.com/">MS Office</a> for PowerPoint, Word and Excel.</p> <p>Most end users will use Smart View to see their raw data and drill into it from top to bottom; to add new members; to change parents of dimensions; or to have a user-friendly view of their forms and reports outside of the user interface. This blog focuses on quickly establishing a connection to Smart View and creating an ad hoc report.</p> <p><strong>Install Smart View</strong></p> <p>To install Smart View, you need to download the add-on from either on-premise Hyperion Planning or PBCS. I’ve added a screenshot of where to go for each install.</p> <p>On-premise:</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben1.png"><img class="alignnone size-medium wp-image-2074" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben1-300x169.png" alt="" width="300" height="169" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben1-300x169.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben1-768x433.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben1-624x352.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben1.png 841w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>PBCS:</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben2.png"><img class="alignnone size-medium wp-image-2073" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben2-300x182.png" alt="" width="300" height="182" 
srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben2-300x182.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben2-624x379.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben2.png 634w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben3.png"><img class="alignnone size-medium wp-image-2072" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben3-300x180.png" alt="" width="300" height="180" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben3-300x180.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben3-768x461.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben3-1024x615.png 1024w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben3-624x375.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben3.png 1251w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Once Smart View is installed on your machine, the Smart View ribbon will appear when you launch Excel, PowerPoint or Word.<strong> </strong></p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben4.png"><img class="alignnone size-medium wp-image-2071" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben4-300x50.png" alt="" width="300" height="50" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben4-300x50.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben4-768x127.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben4-624x104.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben4.png 976w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p><strong>Set Up Options</strong></p> <p>This is usually a one-time setup, but is 
important because this connects you to the database and permits you to set preferences that have important effects on your Smart View experience.</p> <p><strong>Establish Connections</strong></p> <p>Press the “Options” icon.  The Options window will open. Select “Advanced” from the left pane:</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben5.png"><img class="alignnone size-full wp-image-2070" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben5.png" alt="" width="210" height="168" /></a></p> <p>In the “General” section of the right pane is a field for inputting the connection string to the “Plan Essbase” cube. You will enter your unique URL into the “Shared Connections URL” field (I have blacked out for confidentiality purposes):</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben6.png"><img class="alignnone size-medium wp-image-2069" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben6-300x122.png" alt="" width="300" height="122" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben6-300x122.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben6.png 623w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p><strong>Connecting to the Data Source</strong></p> <p>Once your options are properly set, you can connect to the database from Excel.</p> <ol> <li>From the Smart View ribbon, press the “Panel” icon:<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben7.png"><img class="alignnone size-full wp-image-2068" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben7.png" alt="" width="51" height="66" /></a></li> </ol> <ol start="2"> <li>The Smart View connection panel will appear on the right side of the Excel worksheet. 
From the Smart View panel, select “Shared Connections:”<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben8.png"><img class="alignnone size-full wp-image-2067" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben8.png" alt="" width="295" height="121" /></a></li> </ol> <p>You will be prompted for your on-premise or PBCS user ID and password in a pop-up window.  Enter them here and press the “OK” button.</p> <ol start="3"> <li>Expand the “Server Tree” and select an application (ours is called ‘Plan’ in this example):<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben9.png"><img class="alignnone size-full wp-image-2066" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben9.png" alt="" width="173" height="131" /></a></li> </ol> <ol start="4"> <li>At the bottom of the Smart View pane, select “Connect:”<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben10.png"><img class="alignnone size-full wp-image-2065" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben10.png" alt="" width="198" height="38" /></a></li> </ol> <ol start="5"> <li>At the bottom of the Smart View panel, select “Ad hoc analysis:”<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben11.png"><img class="alignnone size-full wp-image-2064" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben11.png" alt="" width="225" height="96" /></a></li> </ol> <ol start="6"> <li>The Excel worksheet will be populated with the dimensions in the database:<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben12.png"><img class="alignnone size-medium wp-image-2063" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben12-300x101.png" alt="" width="300" height="101" 
srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben12-300x101.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben12.png 578w" sizes="(max-width: 300px) 100vw, 300px" /></a></li> </ol> <p><strong>Change the Point of View (POV) Selections</strong></p> <p>To change a selected member in the POV Selector:</p> <ol> <li>Select the down arrow next to the dimension to change:<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben13.png"><img class="alignnone size-full wp-image-2062" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben13.png" alt="" width="156" height="164" /></a></li> <li>Select the ellipsis (…)<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben14.png"><img class="alignnone size-full wp-image-2061" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben14.png" alt="" width="112" height="140" /></a></li> </ol> <ol start="3"> <li>A selection box will appear:<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben15.png"><img class="alignnone size-medium wp-image-2060" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben15-300x154.png" alt="" width="300" height="154" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben15-300x154.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben15-624x319.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben15.png 711w" sizes="(max-width: 300px) 100vw, 300px" /></a></li> </ol> <ol start="4"> <li>Press the Refresh button in the POV selector and the data in the Excel grid will change to match your POV selection:<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben16.png"><img class="alignnone size-full wp-image-2059" 
src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben16.png" alt="" width="60" height="27" /></a></li> </ol> <p><strong>Move POV Selections to the Grid</strong></p> <p>You can drag each POV selection to the Excel grid.  This is generally a preference to match your work style, but if you’re going to be doing side-by-side analysis (e.g., comparing two scenarios in the Excel grid such as actuals to budget), this is mandatory since the POV selector can hold only one member for the same scenario at a time.</p> <p>To move the POV selections to the grid:</p> <ol> <li>Select the down arrow next to the dimension you want to move to the Excel grid:<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben17.png"><img class="alignnone size-full wp-image-2058" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben17.png" alt="" width="164" height="121" /></a></li> </ol> <ol start="2"> <li>Drag to the grid while depressing the mouse button. The dimension member will now be in the grid. Use the basic Excel features to cut-and-paste until you get the report format you want.<br /> <a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben18.png"><img class="alignnone size-medium wp-image-2057" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben18-300x158.png" alt="" width="300" height="158" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben18-300x158.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben18.png 434w" sizes="(max-width: 300px) 100vw, 300px" /></a></li> </ol> <p>This is the report layout and the members in each axis of the report are defined. 
All reports are defined in the Excel grid by three elements:</p> <ul> <li>Rows: The members running down the left side (the vertical axis) of the Excel grid</li> <li>Columns: The members running across the top (the horizontal axis) of the Excel grid</li> <li>Point-of-View: All the other dimensions that define the context of what shows up in the grid</li> </ul> <p>From this standard ad hoc starting point, you can double-click each of the dimension names to drill in one level (this can be done multiple times all the way to level zero); you can click and highlight the dimension name and select “Member Selection” from the toolbar above; or you can click and highlight the dimension name and simply type in the name of the member you’re looking for.</p> <p>Below, I double-clicked “Account,” which expanded out to the next level of the hierarchy below the top of the house “Account” member:</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben19.png"><img class="alignnone size-medium wp-image-2056" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben19-300x206.png" alt="" width="300" height="206" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben19-300x206.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben19.png 451w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Next, I highlighted “Account” and clicked “Member Selection:”</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben20.png"><img class="alignnone size-medium wp-image-2055" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben20-300x229.png" alt="" width="300" height="229" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben20-300x229.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben20-768x586.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben20-1024x781.png 1024w, 
http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben20-624x476.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben20.png 1054w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Then I began typing the name of an account (when the name shows up in the box, simply click “Refresh”):</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben21.png"><img class="alignnone size-medium wp-image-2054" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben21-300x155.png" alt="" width="300" height="155" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben21-300x155.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben21.png 453w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben22.png"><img class="alignnone size-full wp-image-2053" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/ben22.png" alt="" width="54" height="77" /></a></p> <p>Once you add a member to each of the dimensions (in our example, we had nine dimensions) and refresh, you should be able to see your data and continue to use the double-click or member selection to get to the level of member(s) that you want to see. With a little practice and experimenting with different configurations, you’ll be a Smart View wizard in no time!</p> <p>Several other features of Smart View are immensely helpful for an end user, but I don’t have enough time in one blog entry to discuss them all. 
Please feel free to reach out to <a href="mailto:communications@performancearchitects.com">communications@performancearchitects.com</a> if you have any questions about using Smart View.</p> Melanie Mathews http://blog.performancearchitects.com/wp/?p=2052 Wed Jul 12 2017 05:39:59 GMT-0400 (EDT) PRECIS R package http://www.oralytics.com/2017/07/precis-r-package.html <p>If you use R then you are very familiar with the SUMMARY function.</p> <p>If you use R then you are very familiar with the name <a href="http://hadley.nz/">Hadley Wickham</a>. He has produced some really cool packages for R.</p> <p>He has produced a new R package and function that complements the commonly used SUMMARY R function.</p> <p>The following outlines how you can install this new R package from GitHub (Hadley's GitHub is <a href="https://github.com/hadley/">https://github.com/hadley/</a>).</p> <p>Install the R devtools package. This will allow you to download the package code from GitHub.</p> <pre><br />install.packages("devtools")<br /></pre> <p>Install the package from Hadley's GitHub repository.</p> <pre><br />devtools::install_github("hadley/precis")<br /></pre> <p>Load the library.</p> <pre><br />library(precis)<br /></pre> <p>The following displays information produced by the SUMMARY and the PRECIS function.</p> <pre><br />> summary(mtcars)<br /> mpg cyl disp hp drat wt <br /> Min. :10.40 Min. :4.000 Min. : 71.1 Min. : 52.0 Min. :2.760 Min. :1.513 <br /> 1st Qu.:15.43 1st Qu.:4.000 1st Qu.:120.8 1st Qu.: 96.5 1st Qu.:3.080 1st Qu.:2.581 <br /> Median :19.20 Median :6.000 Median :196.3 Median :123.0 Median :3.695 Median :3.325 <br /> Mean :20.09 Mean :6.188 Mean :230.7 Mean :146.7 Mean :3.597 Mean :3.217 <br /> 3rd Qu.:22.80 3rd Qu.:8.000 3rd Qu.:326.0 3rd Qu.:180.0 3rd Qu.:3.920 3rd Qu.:3.610 <br /> Max. :33.90 Max. :8.000 Max. :472.0 Max. :335.0 Max. :4.930 Max. :5.424 <br /> qsec vs am gear carb <br /> Min. :14.50 Min. :0.0000 Min. :0.0000 Min. :3.000 Min. 
:1.000 <br /> 1st Qu.:16.89 1st Qu.:0.0000 1st Qu.:0.0000 1st Qu.:3.000 1st Qu.:2.000 <br /> Median :17.71 Median :0.0000 Median :0.0000 Median :4.000 Median :2.000 <br /> Mean :17.85 Mean :0.4375 Mean :0.4062 Mean :3.688 Mean :2.812 <br /> 3rd Qu.:18.90 3rd Qu.:1.0000 3rd Qu.:1.0000 3rd Qu.:4.000 3rd Qu.:4.000 <br /> Max. :22.90 Max. :1.0000 Max. :1.0000 Max. :5.000 Max. :8.000 <br />> precis(mtcars)<br /># data.frame [32 x 11]<br /> name type precis<br /> <chr> <chr> <chr><br />1 mpg dbl 10.4 [ 15.4 ( 19.2) 22.8] 33.9<br />2 cyl dbl 4 (11) 6 (7) 8 (14)<br />3 disp dbl 71.1 [121.0 (196.0) 334.0] 472.0<br />4 hp dbl 52 [ 96 ( 123) 180] 335<br />5 drat dbl 2.76 [ 3.08 ( 3.70) 3.92] 4.93<br />6 wt dbl 1.51 [ 2.54 ( 3.32) 3.65] 5.42<br />7 qsec dbl 14.5 [ 16.9 ( 17.7) 18.9] 22.9<br />8 vs dbl 0 (18) 1 (14)<br />9 am dbl 0 (19) 1 (13)<br />10 gear dbl 3 (15) 4 (12) 5 (5)<br />11 carb dbl 1 [ 2 ( 2) 4] 8<br />> precis(mtcars, histogram=TRUE)<br /># data.frame [32 x 11]<br /> name type precis<br /> <chr> <chr> <chr><br />1 mpg dbl 10.4 ▂▁▇▃▅▅▂▂▁▁▂▂ 33.9<br />2 cyl dbl 4 ▅▁▁▁▁▁▁▁▁▃▁▁▁▁▁▁▁▁▁▇ 8<br />3 disp dbl 71.1 ▅▁▁▃▇▂▁▁▁▁▃▁▃▁▅▁▁▁▁▁▁ 472.0<br />4 hp dbl 52 ▁▅▅▇▂▂▇▁▂▁▂▁▁▁▁ 335<br />5 drat dbl 2.76 ▂▂▇▂▁▅▇▃▂▁▁▁ 4.93<br />6 wt dbl 1.51 ▁▁▂▂▁▁▂▁▂▁▇▂▂▁▁▁▁▁▁▂▁ 5.42<br />7 qsec dbl 14.5 ▂▂▁▁▃▇▅▁▇▂▂▂▁▁▁▁▁ 22.9<br />8 vs dbl 0 ▇▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▅ 1<br />9 am dbl 0 ▇▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▅ 1<br />10 gear dbl 3 ▇▁▁▁▁▁▁▁▁▅▁▁▁▁▁▁▁▁▁▂ 5<br />11 carb dbl 1 ▅▇▁▂▁▇▁▁▁▁▁▁▁▁ 8<br /> <br /></pre> Brendan Tierney tag:blogger.com,1999:blog-4669933501315263808.post-1772713941999106487 Wed Jul 12 2017 05:18:00 GMT-0400 (EDT) PGX – Parallel Graph AnalytiX : Client tools and Languages https://gianniceresa.com/2017/07/pgx-client-tool-language/ <p>Not long ago I wrote a quick introduction on PGX, Parallel Graph AnalytiX, the Oracle graph solution. 
You can find it here: <a href="https://gianniceresa.com/2017/06/pgx-oracle-graph-analysis-brain/" target="_blank" rel="noopener">PGX &#8211; Parallel Graph AnalytiX : the Oracle graph analysis brain</a>.</p> <p>To use and interact with this graph engine you need a client or a programming language with an interface to it, and Oracle, wanting to make things easy, provides multiple options.</p> <h2>Choose your preferred solution</h2> <p>If you install PGX itself (which can be done by having Java 1.8 and unzipping the file you get from OTN) you end up with the PGX Shell.<br /> It is probably the simplest way to use PGX as it&#8217;s not just a shell but PGX itself (when not running it as a server). All the <a href="https://docs.oracle.com/cd/E56133_01/latest/tutorials/index.html" target="_blank" rel="noopener">tutorials</a> have samples providing the PGX Shell code, and it&#8217;s also probably the best way to start with PGX and graphs.</p> <p>If you are using PGX as part of Oracle Database 12c R2 (what they call the &#8220;Oracle Spatial and Graph&#8221; software package, I believe) or the Big Data one (what Oracle calls &#8220;Oracle Big Data Spatial and Graph&#8221;) you will have a Groovy interface. A script named gremlin-opg-*.sh (where the * can be &#8220;rdbms&#8221;, &#8220;hbase&#8221; or &#8220;nosql&#8221; based on the source you want to use to load graphs) is used to start the interactive shell or to execute scripts.</p> <p>If you are a user of notebooks and want to use <a href="https://zeppelin.apache.org/" target="_blank" rel="noopener">Apache Zeppelin</a> nothing could be easier: a Zeppelin interpreter is provided by Oracle. It is extremely simple to deploy (unzip and configure following the tutorial). The interpreter gives you the same capabilities as the PGX Shell in a Zeppelin notebook.
This is actually the best option, in my opinion, to start exploring the world of PGX graphs as it allows you to easily document what you are doing by adding markdown blocks all around your PGX commands.</p> <div id="attachment_508" style="width: 310px" class="wp-caption aligncenter"><a href="https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_pgx_interpreter.png"><img class="size-medium wp-image-508" src="https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_pgx_interpreter-300x185.png" alt="PGX: Zeppelin interpreter - PGX code like PGX Shell" width="300" height="185" srcset="https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_pgx_interpreter-300x185.png 300w, https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_pgx_interpreter-768x474.png 768w, https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_pgx_interpreter-1024x632.png 1024w, https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_pgx_interpreter-1080x666.png 1080w" sizes="(max-width: 300px) 100vw, 300px" /></a><p class="wp-caption-text">Using the PGX interpreter in Zeppelin you can write the same code as the PGX Shell.</p></div> <p>In addition to the advantages provided by Zeppelin itself, the interpreter implements some Zeppelin visualizations to display the results of some commands.
For example, when executing a PGQL query the result is automatically visible as a Zeppelin table, allowing you to switch it to a bar chart or a few other kinds of visualizations.</p> <div id="attachment_512" style="width: 1034px" class="wp-caption aligncenter"><a href="https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_pgx_interpreter_table_result.png"><img class="size-large wp-image-512" src="https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_pgx_interpreter_table_result-1024x396.png" alt="PGX: Zeppelin interpreter - Zeppelin table for PGQL result" width="1024" height="396" srcset="https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_pgx_interpreter_table_result-1024x396.png 1024w, https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_pgx_interpreter_table_result-300x116.png 300w, https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_pgx_interpreter_table_result-768x297.png 768w, https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_pgx_interpreter_table_result-1080x418.png 1080w" sizes="(max-width: 1024px) 100vw, 1024px" /></a><p class="wp-caption-text">The Zeppelin PGX interpreter provides a table view of a PGQL query result.</p></div> <p>The last option is a programming language, useful if, for example, you want to use PGX as part of an existing application.<br /> Java and JavaScript (Node.js) are provided as <a href="http://www.oracle.com/technetwork/oracle-labs/parallel-graph-analytix/downloads/index.html" target="_blank" rel="noopener">downloads on the OTN page of PGX</a>; in addition, Python is available if you have the &#8220;Oracle Big Data Spatial and Graph&#8221; or &#8220;Oracle Spatial and Graph&#8221; package.
Inside it you will find a Python module for PGX (named pyopg), but this module isn&#8217;t available as a standalone download on OTN so far.</p> <p>As you can see, there are lots of options; what you may not notice is that all of them interact with PGX in the same way&#8230;</p> <h2>Java: one API to rule them all!</h2> <p>Everything is done in Java!</p> <p>PGX Shell is &#8220;just&#8221; the execution of a JAR file linking all the other JARs providing the various functionality.<br /> Groovy is by definition Java and does the exact same thing.</p> <p>The Java library is, obviously, Java itself.</p> <p>Finally, even the provided Python module uses the Java API. The module uses JPype to start a JVM and interact with it, passing commands from Python to Java and getting the result back.</p> <p>Thanks to this Python module you can use PGX from <a href="http://jupyter.org/" target="_blank" rel="noopener">Jupyter Notebook</a>, another well-known and common notebook solution.</p> <p>The issue is that, based on my experience at least, the provided &#8220;pyopg&#8221; Python module is a bit buggy&#8230;</p> <p>The original version with PGX 2.2.0 was working fine. When updated to PGX 2.4.0, to have support for PGQL, it was impossible to use the Analyst object to execute the embedded algorithms. 
Python returns some Java exception and that&#8217;s it.<br /> That&#8217;s why I gave up on the Oracle Python module (also because it isn&#8217;t available as a download on OTN with PGX 2.4.1) and started to write my own code: the module was just an interface between Python and the Java API, so a DIY approach provides better control over what is done, where, when and how.</p> <h2>The Java API documentation is your best friend</h2> <p>As mentioned, the Java API is at the heart of most of the PGX clients, so it&#8217;s definitely worth getting familiar with the available classes and methods.</p> <p>As is often the case with Java APIs, a good Javadoc is what saves you and allows you to get the best out of the API. With PGX it is the exact same thing. The Javadoc is good and covers almost everything (there was just one object I couldn&#8217;t find, but the interface it implemented provided most of the methods I was looking for).</p> <p>You can find the PGX Javadocs at this link: <a href="https://docs.oracle.com/cd/E56133_01/latest/javadocs/index.html" target="_blank" rel="noopener">https://docs.oracle.com/cd/E56133_01/latest/javadocs/index.html</a></p> <blockquote><p><strong>Not everything is implemented &#8230;</strong></p> <p>It is important to note that not everything documented in the Javadoc is currently implemented, at least in PGX 2.4.1.<br /> For example, when using a ChangeSet it is not possible to add labels on vertices if the graph doesn&#8217;t already have at least one vertex with a property.</p> <p>The same applies to properties: if no property exists you will not be able to add one with a ChangeSet.</p> <p>The ChangeSet will not complain, but the newly built graph will not contain it!</p> <p>If you want to load a graph using the PG (flat file) format (.ove &amp; .ovp files), have a look at the &#8220;use_vertex_property_value_as_label&#8221; property of the graph config before you load it.
This is supposed to take the value of one of the vertex properties and define it as the label for the vertices.</p></blockquote> <h2>Your own Python PGX interface</h2> <p>Using Python is quite easy with PGX. First you need to make sure JPype is available in your setup; you can generally verify and install it if missing with <pre class="crayon-plain-tag">pip install JPype1</pre>.</p> <p>Even though I&#8217;m not a Python developer (having used a few other languages for many years, it&#8217;s mainly a matter of adopting a new syntax) I&#8217;m going to release my own version of a small Python class, providing some functions making the interaction with PGX easier and, as a bonus, two methods for Zeppelin: one will display results of PGQL queries as a Zeppelin table (just like the PGX interpreter does), the second will provide a visualization of the graph using D3js to draw vertices and edges.</p> <div id="attachment_507" style="width: 1034px" class="wp-caption aligncenter"><a href="https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_graph_d3js.png"><img class="size-large wp-image-507" src="https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_graph_d3js-1024x601.png" alt="PGX: Zeppelin interpreter - Visualization of the graph with D3js" width="1024" height="601" srcset="https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_graph_d3js-1024x601.png 1024w, https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_graph_d3js-300x176.png 300w, https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_graph_d3js-768x451.png 768w, https://gianniceresa.com/wp-content/uploads/2017/07/PGX_client_zeppelin_graph_d3js-1080x634.png 1080w" sizes="(max-width: 1024px) 100vw, 1024px" /></a><p class="wp-caption-text">Using Python to get the vertices and edges and D3js to build the visualization you can have a view of your graph.</p></div> <p>If you can&#8217;t wait until I upload the code to GitHub, here are the instructions to get you started with Python:</p><pre class="crayon-plain-tag">from jpype import *

# build the class path to use for Java, linking all the PGX JARs (download the PGX Java client)
pgx_jar_classpath = '... set this variable ...'

# start JVM (any other param can be added, like TrustStore, KeyStore etc.)
startJVM(getDefaultJVMPath(), "-ea", "-Djava.class.path=" + pgx_jar_classpath)

pgxClass = JClass('oracle.pgx.api.Pgx')

# create a session on a PGX server
session = pgxClass.createSession('http://pgx-server:port', 'session-name')

# load the graph from disk with a JSON file
# important: the JSON file must be accessible by Python, the graph data file by the PGX server
graph = session.readGraphWithProperties("path_to_the_json_file.json")</pre><p>Next on the list is the release of two simple Docker images for a PGX server and Zeppelin with the PGX interpreter, the simplest way to have a working PGX environment available (using the PGX OTN release, meaning it will not be possible to source your graph from database, NoSQL or HDFS).</p> <p>The post <a rel="nofollow" href="https://gianniceresa.com/2017/07/pgx-client-tool-language/">PGX – Parallel Graph AnalytiX : Client tools and Languages</a> appeared first on <a rel="nofollow" href="https://gianniceresa.com">Gianni&#039;s world: things crossing my mind</a>.</p> Gianni Ceresa https://gianniceresa.com/?p=504 Wed Jul 12 2017 04:23:25 GMT-0400 (EDT) Streaming Global Cyber Attack Analytics with Tableau and Python http://www.rittmanmead.com/blog/2017/07/tableau-real-time-analytics-with-tableau-and-python/ <img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screenshot-2017-06-08-16.54.34-1.png" alt="Streaming Global Cyber Attack Analytics with Tableau and Python"><p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screenshot-2017-06-08-16.54.34.png" alt="Streaming Global Cyber Attack Analytics with Tableau and Python"> <br> </p> <h2>Introduction and Hacks</h2> <p>As grandiose a
notion as the title may imply, there have been some really promising and powerful moves made in the advancement of smoothly integrating real-time and/or streaming data technologies into most any enterprise reporting and analytics architecture. When used in tandem with programming languages like Python, we now have the ability to create enterprise-grade data engineering scripts to handle the manipulation and flow of data, large or small, for final consumption in all manner of business applications. </p> <p>In this cavalcade of coding, we're going to use a combination of Satori, a free data streaming client, and Python to stream live world cyber attack activity via an API. We'll consume the records as JSON, and then use a few choice Python libraries to parse, normalize, and insert the records into a MySQL database. Finally, we'll hook it all up to Tableau and watch cyber attacks happen in real time with a really cool visualization.</p> <p><br> </p> <h2> The Specs </h2> <p>For this exercise, we're going to bite things off a chunk at a time. We're going to utilize a service called <a href="https://www.satori.com/?sort=-created_at&amp;utm_expid=.08UMh4RnQ8SHNc3yk2i52g.1&amp;utm_referrer=https%3A%2F%2Fwww.satori.com%2Foverview">Satori</a>, a streaming data source aggregator that will make it easy for us to hook up to any number of streams to work with as we please. In this case, we'll be working with the Live Cyber Attack Threat Map data set. Next, we'll set up our producer code that will do a couple of things. First, it will create the API client from which we will be ingesting a constant flow of cyber attack records. Next, we'll take these records and convert them to a data frame using the Pandas library for Python. Finally, we will insert them into a MySQL database.
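The record-to-DataFrame-to-table step at the heart of that pipeline can be sketched in miniature. This is a hypothetical, stripped-down example, not the post's actual producer: a hand-made dict stands in for a Satori message, a two-column schema stands in for the full one, and stdlib `sqlite3` stands in for MySQL so the sketch runs anywhere.

```python
# Minimal sketch of the message -> DataFrame -> database step.
# The fake record and the sqlite3 target are illustrative stand-ins;
# the real code receives JSON messages and writes to MySQL.
import sqlite3

from pandas import DataFrame

message = {'attack_type': 'spam', 'country_target': 'USA'}  # fake record

# One message becomes a one-row DataFrame with a fixed column order
data = DataFrame([message], columns=['attack_type', 'country_target'])

conn = sqlite3.connect(':memory:')
data.to_sql('hack_attacks', conn, if_exists='append', index=False)

print(conn.execute('SELECT attack_type FROM hack_attacks').fetchone()[0])
# prints: spam
```

In the full producer the dict is replaced by each message pulled off the Satori subscription, and the sqlite connection by a SQLAlchemy engine pointing at MySQL.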
This will allow us to use this live feed as a source for Tableau in order to create a geo mapping of countries that are currently being targeted by cyber attacks.</p> <p><br> </p> <h2> The Data Source </h2> <p><a href="https://www.satori.com/?sort=-created_at&amp;utm_expid=.08UMh4RnQ8SHNc3yk2i52g.1&amp;utm_referrer="><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screenshot-2017-06-08-16.25.00-1.png" alt="Streaming Global Cyber Attack Analytics with Tableau and Python" title=""></a></p> <p>Satori is a new-ish service that aggregates the web's streaming data sources and provides developers with a client and some sample code that they can then use to set up their own live data streams. While your interests may lie in how you can stream your own company's data, it then simply becomes a matter of using python's requests library to get at whatever internal sources you might need. Find more on the requests library <a href="http://docs.python-requests.org/en/master/">here</a>.</p> <p>Satori has taken a lot of the guess work out of the first step of the process for us, as they provide basic code samples in a number of popular languages to access their streaming service and to generate records. You can find the link to this code in a number of popular languages <a href="https://www.satori.com/channels/live-cyber-attack-threat-map">here</a>. Note that you'll need to install their client and get your own app key. 
I've added a bit of code at the end to handle the insertion of records, and to continue the flow, should any records produce a warning.</p> <p><br> </p> <h3> Satori Code </h3> <pre><code># Imports
from __future__ import print_function
import sys
import threading
from pandas import DataFrame
from satori.rtm.client import make_client, SubscriptionMode

# Local Imports
from create_table import engine

# Satori Variables
channel = "live-cyber-attack-threat-map"
endpoint = "wss://open-data.api.satori.com"
appkey = " "

# Local Variables
table = 'hack_attacks'


def main():
    with make_client(
            endpoint=endpoint, appkey=appkey) as client:
        print('Connected!')

        mailbox = []
        got_message_event = threading.Event()

        class SubscriptionObserver(object):
            def on_subscription_data(self, data):
                for message in data['messages']:
                    mailbox.append(message)
                got_message_event.set()

        subscription_observer = SubscriptionObserver()
        client.subscribe(
            channel,
            SubscriptionMode.SIMPLE,
            subscription_observer)

        if not got_message_event.wait(30):
            print("Timeout while waiting for a message")
            sys.exit(1)

        for message in mailbox:
            # Create dataframe
            data = DataFrame([message],
                             columns=['attack_type', 'attacker_ip', 'attack_port',
                                      'latitude2', 'longitude2', 'longitude',
                                      'city_target', 'country_target', 'attack_subtype',
                                      'latitude', 'city_origin', 'country_origin'])
            # Insert records to table
            try:
                data.to_sql(table, engine, if_exists='append')
            except Exception as e:
                print(e)


if __name__ == '__main__':
    main()
</code></pre> <p><br> </p> <h3> Creating a Table </h3> <p>Now that we've set up the streaming code that we'll use to fill our table, we'll need to set up the table in MySQL to hold them all. For this we'll use the SQLAlchemy ORM (object relational mapper). It's a high falutin' term for a tool that simply abstracts SQL commands to be more 'pythonic'; that is, you don't necessarily have to be a SQL expert to create tables in your given database.
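The producer above imports `engine` from a `create_table` module that the post doesn't show. As a rough, hypothetical sketch of what that module has to provide (a `hack_attacks` table with the twelve columns the DataFrame uses), here is a stdlib-only stand-in using `sqlite3` instead of MySQL and SQLAlchemy so it runs anywhere; everything beyond the table and column names is an assumption.

```python
# Hypothetical sketch of the role played by create_table.py:
# define the target table the producer writes to. SQLite (stdlib)
# stands in for MySQL + SQLAlchemy; the column names come from the
# producer's DataFrame, everything else is illustrative.
import sqlite3

COLUMNS = ['attack_type', 'attacker_ip', 'attack_port',
           'latitude2', 'longitude2', 'longitude',
           'city_target', 'country_target', 'attack_subtype',
           'latitude', 'city_origin', 'country_origin']


def create_table(conn, table='hack_attacks'):
    # All fields as TEXT: the feed delivers JSON strings and nulls
    cols = ', '.join('{} TEXT'.format(c) for c in COLUMNS)
    conn.execute('CREATE TABLE IF NOT EXISTS {} ({})'.format(table, cols))
    conn.commit()


if __name__ == '__main__':
    conn = sqlite3.connect(':memory:')
    create_table(conn)
    # Insert one fake record to confirm the table accepts the expected shape
    placeholders = ', '.join('?' for _ in COLUMNS)
    conn.execute('INSERT INTO hack_attacks VALUES ({})'.format(placeholders),
                 ['spam'] * len(COLUMNS))
    print(conn.execute('SELECT COUNT(*) FROM hack_attacks').fetchone()[0])
```

With SQLAlchemy, the equivalent would be a `Table`/metadata definition plus `create_engine()` called on the connection string discussed next, which is presumably what the post's `create_table.py` does.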
Admittedly, it can be a bit daunting to get the hang of, but give it a shot. Many developers choose to interact with a given database either via direct SQL or using an ORM. It's good practice to use a separate Python file, in this case <code>settings.py</code> (or some variation thereof), to hold your database connection string, entitled <code>SQLALCHEMY_DATABASE_URI</code>, in the following format (the <code>mysqldb</code> tag at the beginning comes from the MySQL library you'll need to install for Python): </p> <pre><code>'mysql+mysqldb://db_user:pass@db_host/db_name' </code></pre> <p>Don't forget to sign in to your database to validate success!</p> <p><br> </p> <h3> Feeding MySQL and Tableau </h3> <p>Now all we need to do is turn on the hose and watch our table fill up. Running <code>producer.py</code>, we can then open a new tab, log in to our database to make sure our table is being populated, and go to work. Create a new connection to your MySQL database (I've called my db 'hacks') in Tableau and verify that everything is in order once you navigate to the data preview. There are lots of nulls in this data set, but this will simply be a matter of filtering them out on the front end.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screenshot-2017-06-08-15.54.13.png" alt="Streaming Global Cyber Attack Analytics with Tableau and Python"></p> <p>Tableau should pick up right away on the geo data in the dataset, as denoted by the little globe icon next to the field. <br> <img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screenshot-2017-06-08-17.07.18.png" alt="Streaming Global Cyber Attack Analytics with Tableau and Python"> We can now simply double-click on the corresponding geo data field, in this case we'll be using <code>Country Target</code>, and then the <code>Number of Records</code> field in the Measures area.
<br> <img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screenshot-2017-06-08-17.09.25.png" alt="Streaming Global Cyber Attack Analytics with Tableau and Python"> I've chosen to use the 'Dark' map theme for this example as it just really jives with the whole cyber attack, international espionage vibe. Note that you'll need to maintain a live connection, via Tableau, to your datasource and refresh at the interval you'd like, if using Tableau Desktop. If you're curious about how to automagically provide for this functionality, a quick Google search will come up with some solutions.</p> Spencer McGhin 722c69a8-2131-41d9-bd09-ec08bc25f5fd Tue Jul 11 2017 11:19:56 GMT-0400 (EDT) Part 3 - Ensuring there is no Discrimination in the Data and machine learning models http://www.oralytics.com/2017/07/part-3-ensuring-there-is-no.html <p>This is the third part of a series of blog posts on '<a href="http://www.oralytics.com/2017/06/how-eu-gdpr-will-affect-use-of-machine.html">How the EU GDPR will affect the use of Machine Learning</a>'</p> The new EU GDPR has some new requirements that will affect what data can be used to ensure there is no discrimination. Additionally, the machine learning models need to ensure that there is no discrimination in the predictions they make. There is an underlying assumption that the organisation has the right to use the data about individuals and that this data has been legitimately obtained. The following outlines the areas relating to discrimination: <ul> <li>Discrimination based on the unfair treatment of an individual through the use of certain variables (for example, race, gender, etc.) that may be inherently discriminatory, and any decisions, whether based on machine learning methods or not, that rely on an individual being part of one or more of these groups.
This is particularly challenging for data scientists and it can limit some of the data points that can be included in their data sets.</li> <li>All data mining models need to be tested to ensure that there is no discrimination built into them. Although the data scientist has removed any obvious variables that may cause discrimination, the machine learning models may have discovered some bias or discrimination in the patterns found in the data.</li> <li>The text preceding the EU GDPR (paragraph 71) details the requirements for data controllers to “implement appropriate technical and organizational measures” that “prevent, inter alia, discriminatory effects” based on sensitive data. Paragraph 71 and Article 22 paragraph 4 address discrimination based on profiling (using machine learning and other methods) that uses sensitive data. Care is needed to remove any associated correlated data.</li> <li>If one group of people is under-represented in a training data set then, depending on the type of prediction being made, the model may unknowingly discriminate against this group when it comes to making a prediction.
The training data sets will need to be carefully partitioned and separate machine learning models built on each partition to ensure that such discrimination does not occur.</li></ul> <p><img src="https://lh3.googleusercontent.com/-7fSzELwBFmE/WVOS0kn16wI/AAAAAAAAMMo/PEUlDmdm_WgH_eHRE5EtUSpFEJb4T6GwgCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="302" height="154" /></p> <p>In the next blog post I will look at addressing the issues relating to Article 22 on the right to an explanation of the outcomes of automated individual decision-making, including profiling, using machine learning and other methods.</p> <br><p>Click back to '<a href="http://www.oralytics.com/2017/06/how-eu-gdpr-will-affect-use-of-machine.html">How the EU GDPR will affect the use of Machine Learning - Part 1</a>' for links to all the blog posts in this series.</p> Brendan Tierney tag:blogger.com,1999:blog-4669933501315263808.post-692265636654222470 Mon Jul 10 2017 10:55:00 GMT-0400 (EDT) Enabling A Modern Analytics Platform http://www.rittmanmead.com/blog/2017/07/enabling-a-modern-analytics-platform/ <p>Over recent years, bi-modal analytics has gained interest and, dare I say it, a level of notoriety, thanks to Gartner’s repositioning of its Magic Quadrant in 2016. I’m going to swerve the debate, but if you are not up to speed, then I recommend taking a look <a href="https://www.forbes.com/sites/jasonbloomberg/2015/09/26/bimodal-it-gartners-recipe-for-disaster/#2564c64a5dd7">here</a> first. </p> <p>Regardless of your chosen stance on the subject, one thing is certain: the ability to provision analytic capabilities in more agile ways and with greater end user flexibility is now widely accepted as an essential part of any modern analytics architecture.</p> <p>But are there any secrets or clues that could help you in modernising your analytics platform?
<br> <br></p> <h2 id="whatisdrivingthebimodalshift">What Is Driving the Bi-Modal Shift?</h2> <p>The demand for greater flexibility from our analytics platforms has its roots in the significant evolutions seen in the business environment. Specifically, we are operating in/with:</p> <ul> <li>increasingly competitive marketplaces, requiring novel ideas, more tailored customer relationships and faster decisions;</li> <li>turbulent global economies, leading to a drive to reduce (capex) costs, maximise efficiencies and a need to deal with increased regulation;</li> <li>broader and larger, more complex and more externalised data sets, which can be tapped into with much reduced latency;</li> <li>empowered and tech-savvy departmental users, with an increased appetite for analytical decision making, combined with great advances in data discovery and visualisation technologies to satisfy this appetite.</li> </ul> <p>In a nutshell, the rate at which change occurs is continuing to gather pace, and so to be an instigator of change (or even just a reactor to it as it happens around you) requires a new approach to analytics and data delivery and execution.</p> <p><br> </p> <h2 id="timetoheadbacktothedrawingboard">Time to Head Back to the Drawing Board?</h2> <p>Whilst the case for rapid, user-driven analytics is hard to deny, does it mean that our heritage BI and Analytics platforms are obsolete and ready for the scrap heap?</p> <p>I don’t think so: the need to be able to monitor operational processes, manage business performance and plan for the future has not suddenly disappeared; the need for accurate, reliable and trusted data which can be accessed securely and at scale is as relevant now as it was before. And this means that, despite what some might have us believe, all the essential aspects of the enterprise BI platforms we have spent years architecting, building and growing cannot be simply wiped away.
</p> <p><em>[Phew!]</em></p> <p>Instead, our modern analytics platforms must embrace both ends of the spectrum equally: highly governed, curated and trustworthy data to support business management and control, coupled with highly available, flexible, loosely governed data to support business innovation. In other words, both modes must coexist and function in a relative balance.</p> <p>The challenge now becomes a very different one: how can we achieve this in an overarching, unified business architecture which supports departmental autonomy, encourages analytical creativity and innovation, whilst minimising inefficiency and friction? Now that is something we can really get our teeth into!</p> <p><br> </p> <h2 id="whatsitallabout">What’s IT All About?</h2> <p>Some questions:</p> <ul> <li>Do you have a myriad of different analytics tools spread across the business <em>which are all being used to fulfil the same ends</em>?</li> <li>Are you constantly being asked to provide data extracts <em>or</em> have you resorted to cloning your production database and provisioning SQL Developer to your departmental analysts?</li> <li>Are you routinely being asked to productionise things that you have absolutely no prior knowledge of?</li> </ul> <p>If you can answer <em>Yes</em> to these questions, then you are probably wrestling with an unmanaged or accidental bi-modal architecture.</p> <p>At Rittman Mead, we have seen several examples of organisations who want to hand greater autonomy to departmental analysts and subject matter experts, so that they can get <em>down and dirty</em> with the data to come up with novel and innovative business ideas. In most of the cases I have observed, this has been driven at a departmental level and instead of IT embracing the movement and leading the charge, results have often been achieved by circumventing IT. 
Even in the few examples where IT have engaged in the process, the scope of their involvement has normally been focused on the provision of hardware and software, or increasingly, the rental of some cloud resources. It seems to me that the bi-modal shift is often perceived as a threat to traditional IT, that it is somehow the thin end of a wedge leading to full departmental autonomy and no further need for IT! In reality, this has never been (and will never be) the ambition or motivation of departmental initiatives.</p> <p>In my view, this slow and faltering response from IT represents a massive missed opportunity. More importantly though, it increases the probability that the two modes of operation will be addressed in isolation and this will only ever lead to siloed systems, siloed processes and ultimately, a siloed mentality. The creation of false barriers between IT and business departments can never be a positive thing. </p> <p>That’s not to say that there won’t be any positive results arising from un-coordinated initiatives, it’s just that unwittingly, they will cause an imbalance in the overall platform: You might deliver an ultra-slick, flexible, departmentally focused discovery lab, but this will encourage the neglect and stagnation of the enterprise platform. Alternatively, you may have a highly accurate, reliable and performant data architecture with tight governance control which creates road-blocks for departmental use cases.</p> <p><br> </p> <h2 id="findingtherightbalance">Finding the Right Balance</h2> <p>So, are there any smart steps that you can take if you are looking to build out a bi-modal analytics architecture? Well, here are a few ideas that you should consider as factors in a successful evolution:</p> <p><em>1. 
Appreciate Your Enterprise Data Assets</em></p> <p>You’ve spent a lot of time and effort developing and maintaining your data warehouse and defining the metadata so that it can be exposed in an easily understandable and user friendly way. The scope of your enterprise data also provides a common base for the combined data requirements for <strong>all</strong> of your departmental analysts. Don’t let this valuable asset go to waste! Instead provide a mechanism whereby your departmental analysts can access enterprise data quickly, easily, when needed and as close to the point of consumption as possible. Then, with good quality and commonly accepted data in their hands, give your departmental analysts a level of autonomy and the freedom to cut loose. </p> <p><em>2. Understand That Governance Is Not a Dirty Word</em></p> <p>In many organisations, data governance is synonymous with red tape, bureaucracy and hurdles to access. This should not be the case. Don’t be fooled into thinking that more agile means less control. As data begins to be multi-purposed, moved around the business, combined with disparate external data sources and used to drive creativity in new and innovative ways, it is essential that the provenance of the enterprise data is known and quantifiable. That way, departmental initiatives will start with a level of intrinsic confidence, arising from the knowledge that the base data has been sourced from a well known, consistent and trusted source. Having this bedrock will increase confidence in your analytical outputs and lead to stronger decisions. It will also drive greater efficiencies when it comes to operationalising the results.</p> <p><em>3. Create Interdependencies</em></p> <p>Don’t be drawn into thinking “our Mode 1 solution is working well, so let’s put all our focus and investment into our Mode 2 initiatives”. Instead, build out your Mode 2 architecture with as much integration into your existing enterprise platform as possible. 
The more interdependencies you can develop, the more you will be able to reduce data handling inefficiencies and increase benefits of scale down the line. Furthermore, interdependency will eliminate the risk of creating silos and allowing your enterprise architecture to stagnate, as both modes will have a level of reliance on one another. It will also encourage good data management practice, with data-workers talking in a common and consistent language.</p> <p><em>4. Make the Transition Simple</em></p> <p>Probably the single most important factor in determining the success of your bi-modal architecture is the quality with which you can transition a Mode 2 model into something operational and production-ready in Mode 1. The more effective this process is, the more likely you are to maximise your opportunities (be it new sales revenue, operating cost etc.) and increase your RoI. The biggest barriers to smoothing this transition will arise when departmental outputs need to be reanalysed, respecified and redesigned so that they can be slotted back into the enterprise platform. If both Mode 1 and Mode 2 activity is achieved with the same tools and software vendors, then you will have a head start…but even if disparate tools are used for the differing purposes, then there are always things that you can do that will help. Firstly, make sure that the owners of the enterprise platform have a level of awareness of departmental initiatives, so that there is a ‘no surprises’ culture…who knows, their experience of the enterprise data could even be exploited to add value to departmental initiatives. Secondly, ensure that departmental outputs can always be traced back to the enterprise data model easily (note: this will come naturally if the other 3 suggestions are followed!). And finally, define a route to production that is not overbearing or cumbersome. 
Whilst all due diligence should be taken to ensure the production environment is risk-free, creating artificial barriers (such as a quarterly or monthly release cycle) will render a lot of the good work done in Mode 2 useless.</p> Mike Vickers 952777bb-4481-4b53-ab43-04f89c8cc484 Mon Jul 10 2017 10:03:00 GMT-0400 (EDT) ODTUG Kscope17 Award Winners Announced http://www.odtug.com/p/bl/et/blogaid=738&source=1 Congratulations to all of the ODTUG Kscope17 award winners, including Oracle Contributor of the Year, ODTUG Volunteer Award, Innovation Award, Best First-Time Speaker, Best Overall Speaker, Top Speakers by Track, Kscope GO Winners, and the Ambassador Winner. ODTUG http://www.odtug.com/p/bl/et/blogaid=738&source=1 Fri Jul 07 2017 09:10:02 GMT-0400 (EDT) Unify - bringing together the best of both worlds http://www.rittmanmead.com/blog/2017/07/unify/ <img src="http://www.rittmanmead.com/blog/content/images/2017/07/Unify-2.png" alt="Unify - bringing together the best of both worlds"><p>Since I started teaching OBIEE in 2011, I had the pleasure of meeting many fascinating people who work with Business Intelligence. </p> <p>In talking to my students, I would generally notice three different situations:</p> <ol> <li><p>Folks were heavy users of OBIEE and just ready to take their skills to the next level.</p></li> <li><p>They were happily transitioning to OBIEE from a legacy reporting tool which didn’t have the power that they needed. </p></li> <li><p>People were being "forced" to transition to OBIEE.
They felt that they were moving away from their comfort zone and diving into a world of complicated mappings that would first require them to become rocket scientists. They were resistant to change. </p></li> </ol> <p>It was this more challenging crowd that mostly sparked my interest in other analytics tools. They asked questions like: “Why are we switching to another system? What are the benefits?” </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Unify-1.png" alt="Unify - bringing together the best of both worlds"></p> <p>I wanted to have a good answer to these questions. Over the years, different projects have allowed me the opportunity to work with diverse reporting tools. My students’ questions were always in mind: Why? And what are the benefits? So, I always took the time to compare and contrast OBIEE with these other tools. </p> <p>I noticed that many of them did a fantastic job of answering the questions asked of them, and so did OBIEE. It didn’t take me long to find the answer that I needed: the main difference in OBIEE is the RPD! </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Unify-2-1.png" alt="Unify - bringing together the best of both worlds"></p> <p>The RPD is where so much Business Intelligence happens. There, developers spend mind-boggling amounts of time connecting the data, deriving complex metrics and hierarchies, joining hundreds of tables, and making everything a beautiful drag-and-drop dream for report writers.</p> <p>Yes, many other tools will allow us to do magic with metadata, but most of them require this magic to be redefined every time we need a new report, or the report has different criteria. Yes, the RPD requires a lot of work upfront, but that work is good for years to come. We never lose any of our previous work, we just enhance our model. Over time, the RPD becomes a giant pool of knowledge for a company and is impressively saved as a file.
</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Unify-3.png" alt="Unify - bringing together the best of both worlds"></p> <p>For tapping into the RPD metadata, traditionally we have used BI Publisher and OBIEE. They are both very powerful and generally complement each other well. Other tools have become very popular in the past few years. Tableau is an example that quickly won the appreciation of the BI community and has kept consistent leadership in Gartner’s BI Magic quadrant since 2013. With a very slick interface and super fast reporting capability, Tableau introduced less complex methods to create amazing dashboards - and fast! So, what is there not to like? There is really so much TO like! </p> <p>Going back to the comparing and contrasting, the main thing that Tableau doesn’t offer is… the RPD. It lacks a repository with the ability to save the join definitions, calculations and the overall intelligence that can be used for all future reports. </p> <p>At Rittman Mead, we’ve been using these tools and appreciate their substantial capabilities, but we really missed the RPD as a data source. We wanted to come up with a solution that would allow our clients to take advantage of the many hours they had likely already put into metadata modeling by creating a seamless transition from OBIEE’s metadata layer to Tableau.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Unify-4.png" alt="Unify - bringing together the best of both worlds"></p> <p>This past week, I was asked to test our new product, called Unify. Wow. Once again, I am so proud of my fellow coworkers. 
Unify has a simple interface and uses a Tableau web connector to create a direct line to your OBIEE repository for use in Tableau reports, stories and dashboards.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Unify-5.png" alt="Unify - bringing together the best of both worlds"></p> <p>In Unify, we select the subject areas from our RPD presentation layer and choose our tables and columns as needed. Below is a screenshot of Unify using the OBIEE 12c Sample App environment. If you are not familiar with OBIEE 12c, Oracle provides the Sample App - a standalone virtual image with everything that you need to test the product. You can download the SampleApp here: <a href="http://www.oracle.com/technetwork/middleware/bi-foundation/obiee-samples-167534.html">http://www.oracle.com/technetwork/middleware/bi-foundation/obiee-samples-167534.html</a></p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Unify-6.png" alt="Unify - bringing together the best of both worlds"></p> <p>We are immediately able to leverage all joins, calculated columns, hierarchies, RPD variables, session variables and that’s not all… our RPD security too! Yes, even row level security is respected when we press the “Unify” button and data is brought back into Tableau. So now, there is no reason to lose years of metadata work because one team prefers to visualize with Tableau instead of OBIEE. </p> <p>Unify allows us to import only the data needed for the report, as we can utilize ‘in-tool’ filtering, keeping our query sets small and our performance high.</p> <p>In sum, Unify unites it all - you can have your cake and eat it too. No matter which tool you love the most, add them together and you will certainly love them both more. 
</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Unify-8.png" alt="Unify - bringing together the best of both worlds"></p> Silvia Rauton c45d1d85-3446-43de-b049-6f652579447c Thu Jul 06 2017 10:00:00 GMT-0400 (EDT) 2017 ODTUG Innovation Award Winner & Honorable Mention http://www.odtug.com/p/bl/et/blogaid=737&source=1 For 2017, member votes were combined with overall judges scoring to determine the Innovation Award winner and Honorable Mention. The winner was announced at the General Session at ODTUG Kscope17. For those that were unable to attend.... drum roll please... ODTUG http://www.odtug.com/p/bl/et/blogaid=737&source=1 Wed Jul 05 2017 10:08:38 GMT-0400 (EDT) Oracle Data Visualization Desktop v3 http://www.rittmanmead.com/blog/2017/07/oracle-data-visualization-desktop-v3-new-features/ <img src="http://www.rittmanmead.com/blog/content/images/2017/07/Boxplot-and-Waterfall-1.png" alt="Oracle Data Visualization Desktop v3"><p>The <a href="http://kscope17.com">ODTUG Kscope17</a> conference last week in San Antonio was a great event with plenty of very interesting sessions and networking opportunities. Rittman Mead participated in the Thursday deep-dive BI session and delivered <a href="https://www.rittmanmead.com/blog/2017/06/rittman-mead-at-kscope-2017/">three sessions</a> including a special "fishing" one.</p> <p><center> <br> <blockquote class="twitter-tweet" data-conversation="none" data-lang="it"><p lang="und" dir="ltr">.
<a href="https://t.co/jC04r4RNvx">pic.twitter.com/jC04r4RNvx</a></p>&mdash; Andrew Fomin (@fomin_andrew) <a href="https://twitter.com/fomin_andrew/status/880084939097743360">28 June 2017</a></blockquote> <script async src="//platform.twitter.com/widgets.js" charset="utf-8"></script> <br> </center></p> <p>In the meantime, Oracle released <a href="http://www.oracle.com/technetwork/middleware/oracle-data-visualization/downloads/oracle-data-visualization-desktop-2938957.html">Data Visualization Desktop 12.2.3.0.0</a>, which was presented in detail during <a href="https://twitter.com/philippe_lions?lang=en">Philippe Lions</a>' session and includes a set of new features and enhancements to already existing functionalities. Starting from new datasources, through new visualization options, in this post I'll go into detail on each of them.</p> <h1 id="datasources">Data Sources</h1> <p>The following new datasources have been introduced:</p> <ul> <li>Oracle Docs: Oracle's <a href="https://cloud.oracle.com/content">content sharing cloud product</a></li> <li>OData: <a href="http://www.odata.org">Open Data Protocol</a>, a standard protocol for RESTful APIs.</li> <li>JDBC</li> <li>ODBC</li> </ul> <p>The latter two (still in beta) are very relevant since they enable querying any product directly exposing JDBC or ODBC connectors (like Presto) without needing to wait for official support in the DVD list of sources. </p> <p>Still, in DVD v3 there is no support for JSON or XML files. In <a href="https://www.rittmanmead.com/blog/2016/11/combining-google-analytics-and-json-data-through-apache-drill-in-oracle-data-visualization-desktop/">my older blog post</a> I wrote about how JSON (and XML) can be queried in DVD using <a href="https://drill.apache.org">Apache Drill</a>; however, this solution has Drill installation and knowledge as a prerequisite, which is not always achievable in end users' environments where self-service BI is happening.
I believe future versions of DVD will address this problem by providing full support to both data sources.</p> <h2 id="connectiontoobiee">Connection to OBIEE</h2> <p>One of the most requested new features is the new interface to connect to OBIEE: until DVD v2 only pre-built OBIEE analyses could be used as sources; with DVD v3, OBIEE Subject Areas are exposed, making them directly accessible. The set of columns and filters can't be retrieved on the fly during project creation but must be defined upfront during datasource definition. This feature avoids moving back and forth between OBIEE and DVD to create an analysis in OBIEE as a datasource and then use it in DVD.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Subject-Areas.png" alt="Oracle Data Visualization Desktop v3"></p> <p>Another enhancement in the datasource definition is the possibility to change the <strong>column delimiter</strong> in txt sources, useful if the datasource has an unusual delimiter.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Separator.png" alt="Oracle Data Visualization Desktop v3"></p> <h1 id="datapreparation">Data Preparation</h1> <p>On the data-preparation side we have two main enhancements: the convert-to-date and the time grain level. <br> The <strong>convert-to-date</strong> feature enhances column-to-date conversion, including the use of custom parsing strings. Still, this feature has some limits, like not being able to parse dates such as <code>04-January-2017</code> where the month name is written out in full. For this date format a two-step approach, truncating the month name and then converting, is still required.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Convert-To-Date.png" alt="Oracle Data Visualization Desktop v3"></p> <p>The second enhancement on the data preparation side is the <strong>time grain level and format</strong>; those options simplify the extraction of attributes (e.g.
Month, Week, Year) from date fields, which can now be done visually instead of writing logical SQL.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Date-Extraction.png" alt="Oracle Data Visualization Desktop v3"></p> <p>The <strong>Dataflow</strong> component in DVD v3 has an improved UI with new column merge and aggregation functionalities which make the flow creation easier. Its output can now be saved as an Oracle database or Hive table, eliminating the need to store all the data locally. </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Dataflow.png" alt="Oracle Data Visualization Desktop v3"></p> <p>It's worth mentioning that Dataflow is oriented to self-service data management: any parsing or transformation happens on the machine where DVD is installed and its configuration options are limited. If more robust transformations are needed then proper ETL software should be used.</p> <h1 id="newvisualizationoptions">New Visualization Options</h1> <p>There are several enhancements on the visualization side, the first being trendline <strong>confidence levels</strong>, which can be shown at fixed intervals (90%, 95% or 99%). <br> <img src="http://www.rittmanmead.com/blog/content/images/2017/07/confidence.png" alt="Oracle Data Visualization Desktop v3"></p> <p><strong>Top N and bottom N filtering</strong> has been added for each measure column, expanding the traditional "range" filter. </p> <p>Two new visualizations have also been included: <strong>waterfall</strong> and <strong>boxplot</strong> are now default visualizations.
Boxplots were available as a plugin in previous versions; however, the five-number summary had to be pre-calculated. In DVD v3 the summary is automatically calculated based on the definition of <em>category</em> (x-axis) and <em>item</em> (value within the category).</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Boxplot-and-Waterfall.png" alt="Oracle Data Visualization Desktop v3"></p> <p>Other new options in the data visualization area include: the usage of <strong>logarithmic scale</strong> for graphs, the type of <strong>interpolation line</strong> to use (straight, curved, stepped ...), and the possibility to <strong>duplicate and reorder canvases</strong> (useful when creating a BI story).</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/Interpolation-line.png" alt="Oracle Data Visualization Desktop v3"></p> <h1 id="console">Console</h1> <p>The latest set of enhancements regards the <strong>console</strong>: a new menu allowing end users to perform tasks, like uploading a plugin, that previously had to be done manually on the file system.</p> <p>The new <a href="https://www.oracle.com/goto/OAStore">Oracle Analytics Store</a> lists add-ins divided into categories:</p> <ul> <li><strong>PlugIn</strong>: New visualizations or enhancements to existing ones (e.g. auto-refresh, providing a similar behaviour to OBIEE's slider)</li> <li><strong>Samples</strong>: Sample projects showing detailed DVD capabilities</li> <li><strong>Advanced Analytics</strong>: custom R scripts providing non-default functionalities</li> <li><strong>Map Layers</strong>: JSON shape files that can be used to render custom map data.</li> </ul> <p>The process to include a new plugin in DVD v3 is really simple: after downloading it from the store, I just need to open DVD's console and upload it.
After a restart of the tool, the new plugin is available.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/07/custom-plugin.png" alt="Oracle Data Visualization Desktop v3"></p> <p>The same applies for Map Layers, while custom R scripts still need to be stored under the <code>advanced_analytics\script_repository</code> subfolder under the main DVD installation folder.</p> <p>As we saw in this blog post, the new Data Visualization Desktop release brings more agility to data discovery, with enhancements both in connectivity to new sources (JDBC and ODBC) and in standard reporting, with OBIEE subject areas now accessible. The new visualizations, the Analytics Store and the plugin management console make the end user workflow extremely easy, even when non-default features need to be incorporated. If you are interested in Data Visualization Desktop and want to understand how it can be proficiently used against any data source, don't hesitate to <a href="mailto:info+ftdvdv3@rittmanmead.com">contact us</a>! </p> Francesco Tisiot b9525769-49aa-4f31-bbf0-23550d1e629e Wed Jul 05 2017 08:57:17 GMT-0400 (EDT) Oracle Hyperion Financial Reporting (FR) Web Studio: FR Studio with a Fresh Coat of Paint http://blog.performancearchitects.com/wp/2017/07/05/oracle-hyperion-financial-reporting-fr-web-studio-fr-studio-with-a-fresh-coat-of-paint/ <p>Author: Mohan Chanila, Performance Architects</p> <p>Back in October of 2016, the announcement was made that the traditional <a href="http://www.oracle.com/technetwork/middleware/epm/downloads/financial-reporting-1112x-2409228.html">Oracle Hyperion Financial Reporting (FR) Studio</a> desktop client that we all loved (or loathed) was going to be replaced by a new Web-based report developer called the… wait for it… <a href="https://docs.oracle.com/cd/E57185_01/HFWCG/toc.htm">Oracle Hyperion Financial Reporting (FR) Web Studio</a>.
While the name itself isn’t the most original, I’ve had the chance to use this new interface significantly over the last few months.</p> <p>Since that announcement and over the last few months, Oracle has made a concerted effort to completely retire the old FR Studio desktop client and replace it with the brand-new FR Web Studio.</p> <p>In our opinion here at Performance Architects, this is all good news and it’s been a long time coming. The traditional FR Studio user interface hasn’t had a makeover in a long time and I’ve always felt that it was due an overhaul.</p> <p>So, let’s discuss the improvements in more detail.</p> <p><strong>Easy Access</strong></p> <p>FR Web Studio can be accessed directly from the Planning user interface. I’m very familiar with how to access the solution, having used this functionality in <a href="https://cloud.oracle.com/planning-and-budgeting-cloud">Oracle Planning and Budgeting Cloud Service (PBCS)</a> implementations. Access is available by clicking on the “Navigator” icon and clicking on the FR Web Studio link as shown below:</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc1.jpg"><img class="alignnone size-medium wp-image-2050" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc1-300x207.jpg" alt="" width="300" height="207" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc1-300x207.jpg 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc1-768x531.jpg 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc1-624x431.jpg 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc1.jpg 945w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p><strong>Improved User Interface</strong></p> <p>The best news about the new user interface in FR Web Studio is its quick response time. Traditional FR Studio response time was always slow and cumbersome by comparison. 
However, in the new UI, the clicks and general navigation are fast and responsive.</p> <p><strong>Enhanced Report Structure</strong></p> <p>While the overall report design has stayed the same, the report structure has massively changed. Previously, you had to work with a single report screen where you had to drag or insert the header, the footer as well as the main body (or grid) of the report. This was sometimes ungainly to work with and additionally, anytime a header or footer had to be resized, the only option was to drag the screen to the required size.</p> <p>The good news is that in the new interface, these three components are now separated into three sections and can be worked on independently. The resizing option has changed as well, and rather than dragging the screen size, it can be changed in the right-hand side properties window.</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc2.jpg"><img class="alignnone size-medium wp-image-2049" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc2-300x152.jpg" alt="" width="300" height="152" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc2-300x152.jpg 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc2-768x389.jpg 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc2-1024x518.jpg 1024w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc2-624x316.jpg 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/06/mc2.jpg 1435w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Apart from the above-mentioned changes which are more drastically different from previous FR Studio iterations, other aspects haven’t changed as much.
This simply means after an initial period getting used to the navigation, it’s a relatively easy transition, as most of the solution capabilities should be familiar to experienced FR Studio users.</p> Melanie Mathews http://blog.performancearchitects.com/wp/?p=2047 Wed Jul 05 2017 05:31:15 GMT-0400 (EDT) Common Questions and Misconceptions in The Data Science Field http://www.rittmanmead.com/blog/2017/07/questions-and-misconceptions-in-the-data-science-field/ <p>There are many types of scenarios in which data science could help your business. For example, customer retention, process automation, improving operational efficiency or user experience.</p> <p>It is not, however, always initially clear which questions to concentrate on, or how to achieve your aims. </p> <p>This post presents information about the types of questions you could address using your data and common forms of bias that may be encountered. </p> <h2 id="typesofquestion">Types of Question</h2> <ul> <li><p>Descriptive: Describing the main features of the data; no implied meaning is inferred. This will almost always be the first kind of analysis performed on the data.</p></li> <li><p>Exploratory: Exploring the data to find previously unknown relationships. Some of the found relationships may define future projects. </p></li> <li><p>Inferential: Looking at trends in a small sample of a data set and extrapolating to the entire population. In this type of scenario you would end up with an estimation of the value and an associated error. Inference depends heavily on both the population and the sampling technique.</p></li> <li><p>Predictive: Looking at current and historical trends to make predictions about future events. Even if x predicts y, x does not cause y. Accurate predictions are hard to achieve and depend heavily on having the correct predictors in the data set.
Arguably, more data often leads to better results; however, large data sets are not always required.</p></li> <li><p>Causal: To get the real relationship between variables you need to use randomised controlled trials and measure average effects, i.e. if you change x by this much, how does y change? Even though this can be carried out on observed data, huge assumptions are required and large errors would be introduced into the results. </p></li> </ul> <h2 id="biasesindatacollectionorcleaning">Biases in data collection or cleaning</h2> <p>It is very easy to introduce biases into your data or methods if you are not careful. <br> Here are some of the most frequent:</p> <ul> <li><p>Selection/sampling bias: If the population selected does not represent the actual population, the results are skewed. This commonly occurs when data is selected subjectively rather than objectively or when non-random data has been selected. </p></li> <li><p>Confirmation bias: Occurs when there is an intentional or unintentional desire to prove a hypothesis, assumption, or opinion.</p></li> </ul> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/dt_c110702.gif" alt=""></p> <ul> <li><p>Outliers: Extreme data values that are significantly out of the normal range of values can completely bias the results of an analysis. If the outliers are not removed in these cases the results of the analysis can be misleading. These outliers are often interesting cases and ought to be investigated separately. </p></li> <li><p>Simpson's Paradox: A trend that is indicated in the data can reverse when the data is split into its component groups. </p></li> <li><p>Overfitting: Involves an overly complex model which overestimates the effect/relevance of the examples in the training data and/or starts fitting to the noise in the training data.
</p></li> </ul> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Untitled-Diagram-3.png" alt=""></p> <ul> <li><p>Underfitting: Occurs when the underlying trend in the data is not found. Could occur if you try to fit a linear model to non linear data or if there is not enough data available to train the model.</p></li> <li><p>Confounding Variables: Two variables may be assumed related when in fact they are both related to an omitted confounding variable. This is why correlation does not imply causation. </p></li> </ul> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-05-at-14.24.06.png" alt=""> <img src="http://www.rittmanmead.com/blog/content/images/2017/06/dt111128.gif" alt=""></p> <ul> <li>Non-Normality: If a distribution is assumed to be normal when it is not the results may be biased and misleading. </li> </ul> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/tumblr_nunwnqFJUp1tpri36o1_540.jpg" alt=""></p> <ul> <li>Data Dredging: This process involves testing huge numbers of hypotheses about a single data set until the desired outcome is found. </li> </ul> <h2 id="citations">Citations:</h2> <p>Comics from <a href="http://dilbert.com">Dilbert Comics By Scott Adams</a>. <br> Spurious Correlations from <a href="http://tylervigen.com/spurious-correlations">http://tylervigen.com/spurious-correlations</a>.</p> <h2 id="insightslab">Insights Lab</h2> <p>To learn more about the Rittman Mead Insights Lab please read my previous blog post about <a href="https://www.rittmanmead.com/blog/2017/06/exploring-the-rittman-mead-insights-lab/">our methodology</a>. </p> <p>Or contact us at <a href="mailto:info+insights@rittmanmead.com?subject=InsightsLab">info@rittmanmead.com</a></p> Hannah Patrick 565f992a-d2af-4517-a4be-126036ee53bb Tue Jul 04 2017 10:06:00 GMT-0400 (EDT) Part 2 - Do I have permissions to use the data for data profiling? 
http://www.oralytics.com/2017/07/part-2-do-i-have-permissions-to-use.html <p>This is the second part of a series of blog posts on '<a href="http://www.oralytics.com/2017/06/how-eu-gdpr-will-affect-use-of-machine.html">How the EU GDPR will affect the use of Machine Learning</a>'</p> <p>I have the data, so I can use it? Right? </p><p>I can do what I want with that data? Right? (sure the customer won't know!)</p> <p>NO. The answer is no: you cannot use the data unless you have been given permission to use it for a particular task. </p> <p>The GDPR applies to all companies worldwide that process personal data of European Union (EU) citizens. This means that any company that works with information relating to EU citizens will have to comply with the requirements of the GDPR, making it the first global data protection law.</p> <p><img src="https://lh3.googleusercontent.com/-I00jaywB7ww/WVOIuRpA_dI/AAAAAAAAML0/-NLqAJbqn6QbuoSt2yEctdE5fl3CtJI_QCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="400" height="180" /></p> <p>The GDPR tightens the rules for obtaining valid consent to using personal information. Having the ability to prove valid consent for using personal information is likely to be one of the biggest challenges presented by the GDPR. Organisations need to ensure they use simple language when asking for consent to collect personal data, they need to be clear about how they will use the information, and they need to understand that silence or inactivity no longer constitutes consent.</p> <p><img src="https://lh3.googleusercontent.com/-_-iZgK10aqw/WVOI6cWrR2I/AAAAAAAAML4/7l-FNnHMySUGj2adiwbTXqRGCWxXT_ytACHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="430" height="180" /></p> <p>You will need to investigate the small print of all the terms and conditions that your customers have signed.
Then you need to examine what data you have, how and where it was collected or generated, and then determine whether you are using this data beyond what the original intention was. If there has been no mention of using the customer data (or any part of it) for analytics, profiling, or anything vaguely related to it then you cannot use the data. This could mean that you cannot use any data for your analytics and/or machine learning. This is a major problem. No data means no analytics and no targeting the customers with special offers, etc.</p> <p><img src="https://lh3.googleusercontent.com/-1bEVeepHjo0/WVOOUFRKvMI/AAAAAAAAMMI/x94O9D92kCYCRpVh5YCjbh2piUD7EbhaACHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="430" height="180" /></p> <p>Data cannot be magically produced out of nowhere and it isn't the fault of the data science team if they have no data to use.</p> <p>How can you overcome this major stumbling block?</p> <p>The first step is to review all the T&Cs. Identify what data can be used and what data cannot be used. One approach for data that cannot be used is to update the T&Cs and get the customers to agree to them. Yes, they need to explicitly agree (or not) to them. Giving them a time limit after which consent is assumed is not allowed. Consent needs to be explicit.</p> <p><img src="https://lh3.googleusercontent.com/-UVR7GF7d5yE/WVOPQKxSbAI/AAAAAAAAMMU/3D1mK-QlrAI9O5xfEjeGZP9Zw3Xf-l7DwCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="186" height="185" /></p> <p>Yes this will be hard work. Yes this will take time. Yes it will affect what machine learning and analytics you can perform for some time.
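In practice, the outcome of that T&C review usually ends up as a per-customer consent flag that every analytics extract must respect. A minimal sketch of such a gate (the <code>analytics_consent</code> field and its structure are hypothetical, not from any specific system), treating silence or a missing flag as "no" in line with the GDPR's consent rules:

```python
from datetime import date

def usable_for_profiling(customer: dict) -> bool:
    """Allow a record into analytics/ML only with an explicit, dated opt-in;
    a missing or empty consent flag counts as 'no consent'."""
    consent = customer.get("analytics_consent")  # hypothetical field name
    return (
        isinstance(consent, dict)
        and consent.get("granted") is True            # explicit agreement
        and consent.get("granted_on") is not None     # provable, dated consent
    )

customers = [
    {"id": 1, "analytics_consent": {"granted": True, "granted_on": date(2018, 3, 1)}},
    {"id": 2, "analytics_consent": None},  # asked, never answered: excluded
    {"id": 3},                             # never asked: excluded
]
profiling_set = [c for c in customers if usable_for_profiling(c)]
# only customer 1 survives the filter
```

The point of the sketch is that the filter runs at extract time, so the data science team only ever sees records whose consent can be proven.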
But the sooner you can identify these areas, get the T&Cs updated, and get the approval of your customers, the better; ideally all of this should be done well in advance of 25th May, 2018.</p> <p><img src="https://lh3.googleusercontent.com/-7wnBwUnyLSg/WVOP2XjMdSI/AAAAAAAAMMc/A40QSH_o9xc-vjr28sLHqmzsgi0iHpL1gCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="228" height="185" /></p> <p>In the next blog post I will look at addressing Discrimination in the data and in the machine learning models.</p> <br><p>Click back to '<a href="http://www.oralytics.com/2017/06/how-eu-gdpr-will-affect-use-of-machine.html">How the EU GDPR will affect the use of Machine Learning - Part 1</a>' for links to all the blog posts in this series.</p> Brendan Tierney tag:blogger.com,1999:blog-4669933501315263808.post-5285802247339147235 Mon Jul 03 2017 11:35:00 GMT-0400 (EDT) OAC: Essbase – Incremental Loads / Automation http://www.rittmanmead.com/blog/2017/07/oac-essbase-incremental-loads-automation/ <p>I recently detailed <strong><em>data load possibilities with the tools provided with Essbase under OAC</em></strong> <a href="https://www.rittmanmead.com/blog/2017/06/oac-essbase-loading-data/">here</a>. Whilst all very usable, my thoughts turned to systems that I have worked on and how the loads currently work, which led to how you might perform incremental and / or automated loads for OAC Essbase. </p> <p>A few background points:</p> <ul> <li>The OAC front end and EssCS command line tools contain a ‘clear’ option for data, but both are full data clears – there does not seem to be a partial or specifiable ‘clear’ available. </li> <li>The OAC front end and EssCS command line tools contain a ‘file upload’ function for (amongst other things) data, rules, and MAXL (msh) script files. 
Whilst the front-end operation has the ability to overwrite existing files, the EssCS Upload facility (which would be used when trying to script a load) seemingly does not – if an attempt is made to upload a file that already exists, an error is shown.</li> <li>The OAC ‘Job’ facility enables a data load to be conducted with a rules file; the EssCS Dataload function (which would be used when trying to script a load) seemingly does not.</li> <li>MAXL still exists in OAC, so it is possible to operate at Essbase ‘command level’</li> </ul> <p>Whilst the tools that are in place all work well and are fine for migration or other manual / adhoc activity, I am not sure what the intended practice might be around some ‘real world’ use cases: a couple of things that spring to mind are </p> <ul> <li>Incremental loads </li> <li>Scheduled loads </li> <li>Large ASO loads (using buffers)</li> </ul> <h3 id="incrementalloads">Incremental Loads</h3> <p>It is arguably possible to perform an incremental load in that </p> <ul> <li>A rules file can be crafted on-prem and uploaded to OAC (along with a partial datafile) </li> <li>Loads appear to be conducted in overwrite mode, meaning changed and new records will be handled OK</li> </ul> <p>It is possible that (eg) a ‘current month’ data file could be loaded and reloaded to form an incremental load of sorts. The problem here will come if data is deleted for a particular member combination in the source from one day to the next – with no partial clear (eg, of current month data) seemingly possible, there is no way of clearing redundant values (at least for an ASO cube…for a BSO load, the ‘Clear combinations’ functionality of the load rules file can be used…although that has not yet been tested on this version).</p> <p>So in the case of an ASO cube, the only option using available tools would be to ensure that ‘contra’ records are added to the incremental load file. 
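</p><p>One way to produce such contra records is to compare the previous and current extract files and emit zero-valued rows for any member combination that has disappeared. The sketch below is illustrative only: the file names and the tab-delimited, value-in-last-column layout are assumptions.</p>

```shell
# Sketch: derive zero-valued 'contra' rows for member combinations that
# existed in the previous extract but are missing from the current one.
# File names and the 4-column tab-delimited layout are assumptions.
printf 'Jan\tSale\tCash\t100\nJan\tSale\tCard\t50\n' > extract_prev.txt
printf 'Jan\tSale\tCash\t120\n' > extract_curr.txt

# Key = all columns except the last (the data value)
awk -F'\t' 'BEGIN { OFS = FS }
     NR == FNR { curr[$1 FS $2 FS $3] = 1; next }
     !(($1 FS $2 FS $3) in curr) { $4 = 0; print }' \
    extract_curr.txt extract_prev.txt > contra.txt

# Final incremental file = current data plus the zeroing rows
cat extract_curr.txt contra.txt > incremental_load.txt
cat incremental_load.txt
```

<p>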
This is not ideal, as it is another process to follow in data preparation, and would also add unnecessary zeros to the cube. For these reasons, I would generally look to effect a partial clear of the ‘slice’ being loaded before proceeding with the incremental load. </p> <p>The only way I can see of achieving this under OAC would be to take advantage of the fact that MAXL is available and effect the clear using <em>alter database clear data</em>.</p> <p>This means that the steps required might be </p> <ul> <li>Upload prepared incremental data file (either manually via OAC or via EssCS UploadFiles after having first deleted the existing file)</li> <li>Upload on-prem prepared rules file (either manually via OAC or via EssCS UploadFiles after having first deleted the existing file)</li> <li>Access the OAC server (eg via PuTTY), start MAXL, and run a command to clear the required slice / merge slices (if necessary)</li> <li>In OAC, create / run a job for the specified data file / rules file</li> </ul> <p>I may have missed something, but I see no obvious way of being able to automate this process with the on-board facilities. </p> <h3 id="automatingtheloadprocess">Automating the load process</h3> <p>Along with the points listed above, some other facts to be aware of: </p> <ul> <li>It is possible to manually transfer files to OAC using FTP</li> <li>It is possible to amend the cron scheduler for the <strong>oracle</strong> user in OAC</li> </ul> <p>Even bearing in mind the above, I should caveat this section by saying getting ‘under the hood’ in this way is possibly not supported or recommended, and should only be undertaken at your own risk. </p> <p>Having said that…</p> <p>By taking advantage of the availability of FTP and cron, it should be possible to script a solution that can run unattended, for full and incremental loads. 
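</p><p>The ‘clear the required slice’ step mentioned above could be held in a small pre-prepared msh script; the following is a sketch only, in which the application/database name and the region expression are assumptions, and the grammar should be verified against the Essbase release running under OAC:</p>

```shell
# Sketch: generate the msh script for a partial ASO clear.
# App/db name and the MDX region are assumptions; $1-$3 are the
# positional credentials/host that startMAXL can pass through.
cat > clear_slice.msh <<'EOF'
login $1 $2 on $3;
/* Remove only the current-month slice before the incremental load */
alter database ASOSamp.Basic clear data in region '{[Jun]}' physical;
logout;
EOF

# On the OAC server this would be run as:
#   startMAXL clear_slice.msh <user> <password> <host>
cat clear_slice.msh
```

<p>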
Furthermore, data clears (full or partial) can be included in the same process, as could parallel buffer loading for ASO or any other MAXL-controllable process (within the confines of this version of Essbase). </p> <h3 id="theoacenvironment"><em>The OAC environment</em></h3> <p>A quick look around discloses that the <em>/u01/latency</em> directory is roughly the equivalent of the <em>../user_projects/epmsystem1/EssbaseServer/essbaseserver1</em> (or equivalent) directory in an on-prem release in that it contains the <em>/app</em> ‘parent’ directory which in turn contains a subdirectory structure for all application and cube artefacts. Examining this directory for <strong>ASOSamp.Basic</strong> shows that the uploaded <strong>dataload.*</strong> files are here, along with all other files listed by the <strong>Files</strong> screen of OAC: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog28.png" alt=""></p> <p>Note that remote connection is via the <strong>opc</strong> user, but this can be changed to <strong>oracle</strong> once connected (by using <em>sudo su - oracle</em>). </p> <p>As <strong>oracle</strong>, these files can be manually deleted…doing so means they will no longer be found by the EssCS <strong>Listfiles</strong> command or the <strong>Files</strong> screen within OAC (once refreshed). If deleted manually, new versions of the files can be re-uploaded via either of the methods detailed above (whilst an overwrite option exists in the OAC <strong>Files</strong> facility, there seems to be no such option with the EssCS <strong>Upload</strong> feature…trying to upload a file that already exists results in an error). 
</p> <p>All files are owned by the <strong>oracle</strong> user, with no access rights at all for the <strong>opc</strong> user that effects a remote connection via FTP.</p> <h3 id="automationobjectives"><em>Automation: Objectives</em></h3> <p>The objective of this exercise was to come up with a method that, unattended, would:</p> <ul> <li>Upload received files (data, rules) to OAC from a local source</li> <li>Put them in the correct OAC directory in a usable format </li> <li>Invoke a process that runs a pre-load process (eg a clear), a load, and (if necessary) a post-load process</li> <li>Clear up after itself</li> </ul> <h3 id="automationtheprocess"><em>Automation: The Process</em></h3> <p>The first job is to handle the upload of files to OAC. This could be achieved via a <strong><em>psftp</em></strong> script that uploads the entire contents of a nominated local directory:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog29.png" alt=""></p> <p>The <strong>Ess<em>CS</em>Upload.bat</strong> script above (which can, of course, be added to a local scheduler so that it runs unattended at appointed times) passes a pre-scripted file to <strong>psftp</strong> to connect and transfer the files. Note that the <strong>opc</strong> user is used for the connection, and the files are posted to a custom-created directory, <em>CUSTOM_receive</em> (under the existing <em>/u01/latency</em>). The transferred files are also given a global ‘rw’ attribute to assist with later processing.</p> <p>Now the files are in the OAC environment, control is taken up there.</p> <p>A shell script (<em>DealWithUploads</em>) is added to the <strong>oracle</strong> home directory:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog30.png" alt=""></p> <p>This copies all the files in the nominated receiving directory to the actual required location – in this case, the main <em>ASOSamp/Basic</em> directory. 
Note the use of ‘-p’ with the copy command to ensure that attributes (ie, the global ‘rw’) are retained. Once copied, the files are deleted from the receiving directory so that they are not processed again. </p> <p>Once the files are copied into place, startMAXL is used to invoke a pre-prepared msh script: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog31.png" alt=""></p> <p>As can be seen, this clears the cube and re-imports from the uploaded file using the uploaded rules file. The clear here is a full reset, but a partial clear (in the case of ASO) can be used here instead if required.</p> <p>As with the ‘local’ half of the method, the <strong>DealWithUploads.sh</strong> script file can be added to the scheduler on OAC: the existing cron entries are already held in the file /u01/app/oracle/tools/home/oracle/crontab.txt; it is a simple exercise to schedule a call to this new custom script.</p> <p>A routine such as this would need a good degree of refinement and hardening – the file lists for the transfers should be self-building, passwords need to be encrypted, the MAXL script should only be called if required, the posting locations for files should be content/context sensitive, etc – but in terms of feasibility testing the requirements listed above, it was successful. </p> <p>This approach places additional directories and files in an environment / structure that could be subject to maintenance at any time: it is therefore imperative that some form of code control / release mechanism is employed so that it can be replaced in the event of any unexpected / uncontrollable maintenance taking place on the OAC environment that could invalidate or remove it. 
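</p><p>As an illustration of the remote half described above, a DealWithUploads-style script might look like the following sketch. The directory names mirror those in the post but should be treated as assumptions, a demo seed file makes the sketch runnable locally, and the MAXL call is guarded so the sketch degrades gracefully where <strong>startMAXL</strong> is absent:</p>

```shell
#!/bin/sh
# Sketch of a 'DealWithUploads'-style script. On the OAC box the paths
# would be /u01/latency/CUSTOM_receive and /u01/latency/app/ASOSamp/Basic;
# relative paths are used here so the sketch runs anywhere.
RECEIVE=./CUSTOM_receive
TARGET=./app/ASOSamp/Basic

mkdir -p "$RECEIVE" "$TARGET"
printf 'demo\n' > "$RECEIVE"/dataload.txt  # demo seed, not part of the real flow

for f in "$RECEIVE"/*; do
  [ -f "$f" ] || continue
  cp -p "$f" "$TARGET"/   # -p preserves the global 'rw' attributes
  rm -f "$f"              # remove so the file is not processed again
done

# Only call MAXL where it exists (it will on the OAC server, not locally)
if command -v startMAXL >/dev/null 2>&1; then
  startMAXL loadASOSamp.msh
else
  echo "startMAXL not found - skipping MAXL step"
fi
```

<p>The script could then be scheduled by adding a line via the crontab.txt mechanism mentioned above; the schedule itself is site-specific.</p><p>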
</p> <p>Even once hardened, I think there is a considerable weak spot in this approach in that the rules file seemingly has to be crafted in an on-prem environment and uploaded: as I detailed <a href="https://www.rittmanmead.com/blog/2017/06/oac-essbase-and-dvcs/">here</a>, even freshly-uploaded, working rules files error when an attempt is made to verify them. For now, I’ll keep looking for an alternative. </p> <h3 id="summary">Summary</h3> <p>Whilst a lot of the high-level functionality is in place around data loads, often with multiple methods, I think there are a couple of detailed functionality areas that may currently require workarounds – to my mind, the addition of the ability to select &amp; run an msh format ‘preload’ script when running a dataload Job (eg for clears) would be useful, whilst a fully functional rules file editor strikes me as important. The fact that an FTP connection is available at all is a bonus, but because this is as a non-<strong>oracle</strong> user, it is not possible to put a file in the correct place directly - the EssCS <strong>Upload</strong> facility does this of course, but the seeming absence of an overwrite option (or an additional <strong>Delete</strong> option for EssCS) somewhat limits its usefulness at this point. But can you implement an unattended, scheduled load or incremental load routine? Sure you can. </p> Mark Cann 4120ce4d-0ddc-4be6-bdf3-31d8350eeadf Mon Jul 03 2017 09:56:00 GMT-0400 (EDT) How OAC Benefits The Finance Analyst (Ep 051) https://www.youtube.com/watch?v=1TxetRNQJoU Red Pill Analytics yt:video:1TxetRNQJoU Mon Jul 03 2017 09:48:33 GMT-0400 (EDT) OAC: Essbase – Loading Data http://www.rittmanmead.com/blog/2017/06/oac-essbase-loading-data/ <p>After my initial quick pass through Essbase under OAC <a href="https://www.rittmanmead.com/blog/2017/06/oac-essbase-and-dvcs/">here</a>, this post looks at the data loading options available in more detail. 
I used the provided sample database <strong>ASOSamp.Basic</strong>, which first had to be created, as a working example.</p> <h3 id="creatingasosamp">Creating ASOSamp</h3> <p>Under the time-honoured on-prem install of Essbase, the sample applications were available as an install option – supplied data has to be loaded separately, but the applications / cubes themselves are installed as part of the process if the option is selected. This is not quite the same under OAC – some are provided in an easily installable format, but they are not immediately available out-of-the-box. </p> <p>One of the main methods of cube creation in Essbase under OAC is via the <strong>Import</strong> of a specifically formatted Excel spreadsheet, and it is via the provision of downloadable pre-built ‘template’ spreadsheets that the sample applications are installed in this version. </p> <p>After accessing the homepage of Essbase on OAC, download the provided cube creation template – this can be found under the ‘Templates’ button on the home page: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog1-1.png" alt=""></p> <p>Note that in the case of the ASOSamp.Basic sample database, the data is <strong>not</strong> in the main template file – it is held in a separate file. This is different to other examples, such as <strong>Sample.Basic</strong>, where the data provided is held in a dedicated tab in the main spreadsheet. Download both <strong>Aggregate Storage Sample</strong> and <strong>Aggregate Storage Sample Data</strong>: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog2-1.png" alt=""></p> <p>Return to the home page, and click <strong>Import</strong>. Choose the spreadsheet downloaded as <strong>Aggregate Storage Sample</strong> (ASO_Sample.xlsx) and click <strong>Deploy and Close</strong>. 
</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog3.png" alt=""></p> <p>This will effect all of the detail in the spreadsheet – create the application, create the cube, add dimensions / attribute dimensions and members to the outline, etc:</p> <h3 id="loadingasosampbasic">Loading ASOSamp.Basic</h3> <p>Because the data file is separate from the spreadsheet, the next step is to upload this to OAC so that it is available for loading: back on the home page, select the newly-created <strong>ASOSamp.Basic</strong> (note: not ASOSamp.Sample as with on-prem), and click <strong>Files</strong>:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog4-1.png" alt=""></p> <p>In the right-hand window, select the downloaded data file <strong>ASO_Sample_Data.txt</strong> and click the <strong>Upload</strong> button:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog5-1.png" alt=""></p> <p>This will upload the file:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog6.png" alt=""></p> <p>Once the file upload is complete, return to the home page. With the newly-created <strong>ASOSamp.Basic</strong> still selected, click <strong>Jobs</strong>:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog7.png" alt=""></p> <p>Choose <strong>Data Load</strong> as the Job Type, and highlight the required <strong>Data File</strong>:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog8.png" style="width: 300px;"></p> <p>Click <strong>Execute</strong>.</p> <p>A new line will be added to the Job Monitor:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog9-1.png" alt=""></p> <p>The current status of the job is shown – in this case, ‘in progress’ – and the screen can be refreshed. 
</p> <p>Once complete, the Status field will show the completion state of the job, whilst the Job Details icon on the right-hand side provides more detail – in this case, confirming that 311,795 records were successfully loaded, and 0 rejected: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog10.png" alt=""></p> <p>The success of the load is confirmed by a quick look in Smartview:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog12.png" alt=""></p> <p>Note that a rules file was not selected as part of the job – this makes sense when we look at the data file…</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog11-1.png" alt=""></p> <p>...which is familiar-looking: just what we would expect from an EAS <strong>export</strong> (MAXL: <em>export database</em>), which can of course just be loaded in a similar no-rules-file way on-prem. </p> <p><em>Incidentally, this is different to the on-prem approach to ASOSamp.Sample where a ‘flat’, tab-delimited data file is provided for the sample data, along with a rules file that is required for the load:</em></p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog16-onprem1.png" alt=""></p> <p><em>...although the end-results are the same:</em></p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog17-onprem-result.png" alt=""></p> <p>This ‘standard’ load works in overwrite mode – any new values in the file will be added, but any that exist already will be overwritten: running the load again and refreshing the Smartview report results in the same numbers, confirming this. 
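</p><p>For comparison, the <em>export database</em> statement that produces a native-format file of this kind can be captured in a short msh script; this is a sketch, in which the database name, export level and file name are assumptions:</p>

```shell
# Sketch: MAXL export that yields a native-format data file of the kind
# discussed above. Names/level are assumptions; run inside a MAXL session.
cat > export_aso.msh <<'EOF'
export database ASOSamp.Basic level0 data to data_file 'ASOSamp_export.txt';
EOF
cat export_aso.msh
```

<p>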
</p> <p>This can be verified further by running with a changed data file: taking a particular node of data for the Units measure… </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog13.png" alt=""></p> <p>One of the constituent data values can be changed in a copy of the data file – in this example, one record (it doesn’t matter which for this purpose) can be increased – in this case, ‘1’ has been increased to ‘103’:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog14.png" alt=""></p> <p>The amended file needs to be saved and uploaded to OAC as outlined above, and the load process repeated, this time using the amended file. After a successful load, the aggregated value on the test Smartview report has increased by the same 102: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog15.png" alt=""></p> <h3 id="loadingflatfiles">Loading flat files</h3> <p>So, how might we load the same sort of flat, tab-delimited file like the one supplied as the on-prem ASOSamp.Sample data file? </p> <p>As above, files can be uploaded to OAC, so putting the <strong>dataload.txt</strong> data file from the on-prem release into OAC is straightforward. However, as you’d expect, attempting to run this as a load job without a rules file results in an error. </p> <p>However, it is possible to run an OAC load with a rules file <em>created in an on-prem version</em>: firstly, upload the rules file (in this case, <strong>dataload.rul</strong>) in the same way as the data file. When setting up the load job, select the data file as normal, but under Scripts select the rules file required: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog18.png" style="width: 300px;"></p> <p>The job runs successfully, with the ‘Details’ overlay confirming the successful record count. 
</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog19.png" alt=""></p> <p>As with rules files generated by the Import facility, uploaded rules files can also be edited in text mode:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog20.png" alt=""></p> <p>It would seem logical that changing the <strong><em>dataLoadOptions</em></strong> value at line 215 to a value other than OVERWRITE (eg ADD) might be a quick behavioural change for the load that would be easy to effect. However, making this change resulted in verification errors. Noting that the errors related to invalid dimension names, an attempt was made to verify the actual, unchanged rules file as uploaded…which also resulted in the same verification errors. So somewhat curiously, the uploaded on-prem rules file can be successfully used to load a corresponding data file, but (effectively) can’t be edited or amended.</p> <h3 id="loadingfromspreadsheettemplate">Loading from Spreadsheet Template</h3> <p>The template spreadsheets used to build applications can also contain one or more data tabs. Unlike the OAC Jobs method or EssCS Dataload, the spreadsheet method gives you the option of a rules file AND the ability to Add (rather than overwrite) data: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog32.png" alt=""></p> <p>Within OAC, this is actioned via the ‘Import’ function on the home page: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog33.png" alt=""></p> <p>Note that we are retaining all data, and have the Load Data box checked. Checks confirm the values in the file are added to those already in the cube. 
</p> <p>The data can also be uploaded via the Cube Designer in Excel under Cube Designer / Load Data: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog34.png" alt=""></p> <p>Note that unlike running this method under OAC, the rules file (which was created by the initial import as the Data tab existed in the spreadsheet at that point) has to be selected manually. </p> <p>Once complete, an offer is made to view the Job Status Viewer (which can also be accessed from Cube Designer / View Jobs):</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog35.png" alt=""></p> <p>With further detail for each job also being available: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog36.png" style="width: 350px;"></p> <h3 id="usefacilitiestouploadfiles">Use facilities to upload files</h3> <p>Given the ability to upload and run both data and rules files, the next logical step would be to script this for automated running. OAC contains a downloadable utility, the <strong>Command Line Tool</strong> (aka CLI, EssCS), which is a set of interface commands that can be run locally against an OAC instance of Essbase:</p> <p>Login / Logout <br> Calc <br> Dataload <br> Dimbuild <br> Clear <br> Version <br> Listfiles <br> Download <br> Upload <br> LcmExport <br> LcmImport</p> <p>Running locally, a successful EssCS login effectively starts a session that then remains open for other EssCS commands until the session is closed with a logout command. </p> <p>The login syntax suggests the inclusion of the port number in the URL, but I had no success with this…although it worked without the port reference:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog21.png" alt=""></p> <p>As above, the connection is made and is verified by the successful running of another command (eg version), but the <strong>logout</strong> command produced an error. 
Despite this, the logout appeared successful – no other EssCS commands worked until a login was re-issued. </p> <p>With EssCS installed and working, the <strong>Listfiles</strong> and <strong>Upload</strong> facilities become available. The function of these tools is pretty obvious from the name. <strong>Listfiles</strong> should be issued with at least arguments for the application and cube name: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog22.png" alt=""></p> <p>The file type (csc, rul, txt, msh, xls, xlsx, xlsm, xml, zip, csv) can be included as an additional argument…</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog23.png" alt=""></p> <p>…although the file type must come from a fixed list – for example, you don’t seem to be able to use a wild card to pick up all spreadsheet files.</p> <p>Whilst there is an <strong>Upload</strong> (and <strong>Download</strong>) facility, there does not seem to be the means to delete a remote file…which is a bit of an inconvenience, because using Upload to upload a file that already exists results in an error, and there is no overwrite option. The <strong>dataload.txt</strong> and <strong>dataload.rul</strong> files previously uploaded via the OAC front end were therefore manually deleted via OAC, and the deletion verified using <strong>Listfiles</strong>. </p> <p>The files were then uploaded back to OAC using the Upload option of EssCS: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog25.png" alt=""></p> <p>As you would expect, the files will then appear both in a Listfiles command and via OAC: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog24.png" alt=""></p> <p>Note that the file list in OAC does not refresh with a browser page refresh or any ‘sort’ operation: use <strong>Refresh</strong> under <strong>Actions</strong> as above. </p> <p>With the files now re-uploaded, the data can be loaded. 
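</p><p>The EssCS steps above lend themselves to a wrapper script. Because the exact argument syntax is not documented in this post, the sketch below only echoes hypothetical command lines (a dry run) rather than executing them; every flag shown is an assumption to be checked against the tool's own help:</p>

```shell
# Dry-run sketch of a scripted EssCS session: login, list, upload, load.
# The 'esscs' name and all argument forms are assumptions; the calls are
# echoed to a log instead of executed so the sequence can be inspected.
rm -f esscs_session.log
run() { printf 'esscs %s\n' "$*" | tee -a esscs_session.log; }

run login -url https://myoac.example.com/essbase -u admin
run listfiles ASOSamp Basic txt
run upload -a ASOSamp -db Basic -f dataload.txt
run upload -a ASOSamp -db Basic -f dataload.rul
run dataload -a ASOSamp -db Basic -f dataload.txt
run logout
```

<p>Swapping the <em>printf</em> for the real CLI call would turn the dry run into a live session, once the argument forms have been confirmed.</p><p>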
EssCS also contains a <strong>DataLoad</strong> command, but unfortunately there appears to be no means to specify a rules file – meaning it would seem to be confined to overwrite, ‘export data’ style imports only: </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog26.png" alt=""></p> <p>A good point here is that a <strong>DataLoad</strong> EssCS command makes an entry in the Jobs table, so success / record counts can be confirmed:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/blog27.png" alt=""></p> <h3 id="summary">Summary</h3> <p>The post details three methods of loading data to Essbase under OAC:</p> <ul> <li>Via the formatted template spreadsheet (on import or from Cube Designer)</li> <li>Via the Command Line Interface</li> <li>Via the Jobs facility of OAC</li> </ul> <p>There are some minor differences between them, which may affect which you may wish to use for any particular scenario. </p> <p>Arguably, given the availability of MAXL, there is a further custom method available as the actual data load can be effected that way too. This will be explored further in the next post that will start to consider how these tools might be used for real scenarios. </p> Mark Cann de19e59b-64a7-4c96-8bb9-213b817cf9e3 Fri Jun 30 2017 10:33:15 GMT-0400 (EDT) Usability, Product Management, and LinkedIn - a rant http://bi.abhinavagarwal.net/2017/06/usability-product-management-and.html <div dir="ltr" style="text-align: left;" trbidi="on"><span style="color: black; float: left; font-family: &quot;times&quot; , serif , &quot;georgia&quot;; font-size: 48px; line-height: 30px; padding-right: 2px; padding-top: 2px;">L</span><br />inkedIn began as a professional networking site, has evolved into a social media behemoth, and yet has managed to maintain and sharpen its focus on the professional space. 
That may, in part, explain why, in 2016, Microsoft chose to put down more than <a href="https://www.wsj.com/articles/microsoft-to-acquire-linkedin-in-deal-valued-at-26-2-billion-1465821523" rel="nofollow noopener" saprocessedanchor="true" target="_blank">$26 billion Washingtons to buy LinkedIn</a>.<br /><div class="prose" itemprop="articleBody">While both LinkedIn's web site and mobile app have undergone substantial changes over the years, and are a far cry from the spartan look both sported just a few years ago, I wanted to call out one peculiarity - call it eccentricity - that the site has. I would call it a <strong>glaring UX and product management</strong> miss, if you will.<br />Let me elaborate.<br /><div class="slate-resizable-image-embed slate-image-embed__resize-left" data-imgsrc="https://media.licdn.com/mpr/mpr/AAEAAQAAAAAAAAqlAAAAJDY4MGJmMjliLTE2YjAtNDAxMC04MGQ1LTE1MmIzZjg0MjM4Mg.png"><table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody><tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-_i2XT4KtPkI/WVXkVCs9VII/AAAAAAAAOQU/Tp_PZ956GzMbYw1QSfxc1ooeApP55scTACLcBGAs/s1600/01377.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="605" data-original-width="639" height="302" src="https://4.bp.blogspot.com/-_i2XT4KtPkI/WVXkVCs9VII/AAAAAAAAOQU/Tp_PZ956GzMbYw1QSfxc1ooeApP55scTACLcBGAs/s320/01377.png" width="320" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">email from LinkedIn in June 2014, announcing the launch of the publish feature.</td></tr></tbody></table></div>Sometime in April 2014, LinkedIn introduced a feature that allowed users - by invitation at first, and to everyone later - to <strong>publish their articles on LinkedIn</strong>. This feature is now a great source of user-generated content for LinkedIn, helping drive more traffic to its website. 
I have written a few over the last couple of years, and it's a great way to put my thoughts on relevant topics in front of a relevant audience.<br /><h3>But Where Are My Articles?</h3>From the LinkedIn home page, try finding a way to navigate to your articles - published or in draft mode. Go ahead, I will wait while you wander on the home page.<br />You can't.<br />Let me show. See the screenshot below. That is the home page I see when I go to LinkedIn.<br /><ol><li>The menu at the top contains no links to go to my articles.</li><li>I can click the 'Write an article' button and it will take me to the <a href="https://www.linkedin.com/post/new" saprocessedanchor="true" target="_blank">LinkedIn Publishing page</a>, and I can start penning pristine prose there.</li><li>I can click the headline and view <a href="https://www.linkedin.com/in/abhinavagarwal/recent-activity/shares/" saprocessedanchor="true" target="_blank">analytics on my articles or shares</a>.</li></ol><a href="https://2.bp.blogspot.com/-jAmQahEI-wE/WVXkVND7g6I/AAAAAAAAOQY/Deh1otPEBIUxvSEkcp_urXIqtdCAqzUCwCLcBGAs/s1600/01381.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" data-original-height="900" data-original-width="1582" height="363" src="https://2.bp.blogspot.com/-jAmQahEI-wE/WVXkVND7g6I/AAAAAAAAOQY/Deh1otPEBIUxvSEkcp_urXIqtdCAqzUCwCLcBGAs/s640/01381.png" width="640" /></a><br />But I <strong>still</strong> cannot view a list of my articles. I can't.<br /><div class="slate-resizable-image-embed slate-image-embed__resize-full-width" data-imgsrc="https://media.licdn.com/mpr/mpr/AAEAAQAAAAAAAArsAAAAJGMwZjJhMTQ4LTNhZDEtNDczNy05NjMxLTBjMjk2YWU3NjcyOQ.png"><br /></div><ul><li>If I go to the Publishing page, and if I click the 'More' dropdown, then voila, I can see that I have finally found what I was looking for. 
So will you too.</li></ul><div class="slate-resizable-image-embed slate-image-embed__resize-full-width" data-imgsrc="https://media.licdn.com/mpr/mpr/AAEAAQAAAAAAAAt1AAAAJDc3YTc3NjQ4LTcwYTQtNGNhMy05NzM1LWY2ZjM5ODBjOThhMg.png"><div class="separator" style="clear: both; text-align: center;"><a href="https://1.bp.blogspot.com/-TZ0w9nt6YHI/WVXkTbSNxqI/AAAAAAAAOQQ/my9_UwwvIesntR-Fl2wn3uUOahbByM5VgCLcBGAs/s1600/01382.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" data-original-height="581" data-original-width="1569" height="235" src="https://1.bp.blogspot.com/-TZ0w9nt6YHI/WVXkTbSNxqI/AAAAAAAAOQQ/my9_UwwvIesntR-Fl2wn3uUOahbByM5VgCLcBGAs/s640/01382.png" width="640" /></a></div></div>Why? Why make it so darn tough to find your own articles?<br /><ul><li>By design? Unlikely.</li><li>Oversight? Likely. A miss, from both product management and UX. Why is an important feature such as this so difficult to find? It is not even available from the home page. Why isn't anyone talking about discoverability? What about the <strong>scent of information</strong>? <a href="https://www.nngroup.com/people/jakob-nielsen/" rel="nofollow noopener" saprocessedanchor="true" target="_blank">Nielsen</a>, <a href="https://en.wikipedia.org/wiki/Alan_Cooper" rel="nofollow noopener" saprocessedanchor="true" target="_blank">Cooper</a>, <a href="https://en.wikipedia.org/wiki/Information_foraging" rel="nofollow noopener" saprocessedanchor="true" target="_blank">Pirolli</a>, anyone?</li></ul>Solution? Fix it. 
Fast.<br /><br />[<i>this post 
first appeared in <a href="https://www.linkedin.com/">LinkedIn</a> on <a href="https://www.linkedin.com/pulse/linkedin-usability-rant-abhinav-agarwal">June 29th, 2017</a></i>]<br /><br /></div><span style="color: #666666; font-size: x-small;">© 2017, Abhinav Agarwal. All rights reserved.</span></div> Abhinav Agarwal tag:blogger.com,1999:blog-13714584.post-2501411967272936772 Fri Jun 30 2017 03:09:00 GMT-0400 (EDT) Oracle BI Cloud Service (BICS) Access Options: Data Sync Overview and Configuration http://blog.performancearchitects.com/wp/2017/06/28/oracle-bi-cloud-service-bics-access-options-data-sync-overview-and-configuration/ <p>Author: Doug Ross, Performance Architects<strong> </strong></p> <p><strong>Introduction</strong></p> <p>As more organizations move their business intelligence (BI) environments to the cloud, loading and accessing enterprise data will become as important as the front-end visualizations.  <a href="https://cloud.oracle.com/business_intelligence">Oracle&#8217;s BI Cloud Service (BICS)</a> offers several options for those data requirements that go beyond simple data upload. Each has a specific purpose, features, benefits, and limitations. One of the more powerful options for true enterprise-level data transfer to the cloud is Oracle’s <a href="http://www.oracle.com/technetwork/middleware/bicloud/downloads/index.html">Data Sync</a> tool.</p> <p><strong>Data Sync Overview</strong></p> <p>Data Sync provides a full-featured data transfer tool with a client interface that allows for scheduling load jobs that efficiently move data from flat files, database tables, and other cloud data sources into the <a href="https://docs.oracle.com/cloud/latest/dbcs_schema/index.html">BICS Database Schema Service</a> or <a href="https://cloud.oracle.com/database">Oracle Database Cloud Service</a>.  
It can also directly load data as a data set source for the <a href="https://docs.oracle.com/cloud/latest/reportingcs_use/BILUG/GUID-7DC34CA8-3F7C-45CF-8350-441D8D9898EA.htm#BILUG-GUID-7DC34CA8-3F7C-45CF-8350-441D8D9898EA">Visual Analyzer</a> projects that are available in BICS.</p> <p>It includes many of the features found in other data loading tools: logging of load job execution steps, restarting after failures, incremental loading of new or modified data, and configuring the sequence of load operations.  A Data Sync solution lends itself either to new development of data load processes or to a more agile analytics environment that allows processes and data models to change more rapidly than would be possible with an on-premises database.</p> <p><strong>Data Sync Configuration Steps</strong></p> <p>Data Sync’s primary function is to upload data into a BICS environment.  Data can be loaded from flat files (CSV or XLSX), relational database sources (tables, views, or SQL statements that are executed dynamically), <a href="http://www.oracle.com/us/products/applications/fusion/hcm-fusion-transactional-bi-1543884.pdf">Oracle Transactional BI (OTBI)</a>, <a href="https://en.wikipedia.org/wiki/Java_Database_Connectivity">JDBC</a> data sources (e.g., <a href="https://www.mongodb.com/">MongoDB</a>, <a href="https://en.wikipedia.org/wiki/Cloudera_Impala">Impala</a>, <a href="http://www.salesforce.com/">Salesforce</a>, <a href="https://aws.amazon.com/redshift/">Redshift</a>), or the <a href="https://cloud.oracle.com/service-cloud">Oracle Service Cloud</a>.  
Uploaded data can be stored in cloud-based tables or data sets accessible to the data visualization components.</p> <p>Data Sync can:</p> <ul> <li>Load data sources other than Oracle in addition to data files or Oracle tables</li> <li>Execute incremental data loads or rolling deletes and insert / append strategies</li> <li>Merge data from multiple sources</li> <li>Schedule data loads</li> </ul> <p>Data Sync is installed on a local computer running either the Windows or Linux operating systems.  Prior to installing Data Sync, ensure that Java Development Kit (JDK) 1.7 or later is installed on the local computer.  It must be the JDK and not a JRE.  It is also necessary to validate that the user account that will be used to access the BICS database schema has the proper permission.   Work with your cloud administrator to request permission to upload data to Oracle BICS by assigning the BI Data Load Author application role to the account.   To upload data to a data set instead of a table, the BI Advanced Content Author application role should be assigned.</p> <p>Installation Steps:</p> <ol> <li>Download the Data Sync software from Oracle Technology Network. Currently located at: <a href="http://www.oracle.com/technetwork/middleware/bicloud/downloads/index.html">Data Sync Download on OTN</a></li> <li>Unzip BICSDataSync.Zip to a local directory (no spaces in directory name)</li> <li>Set the JAVA_HOME variable in config.bat or config.sh to point to the JDK location</li> <li>Copy database-specific JDBC drivers to Data Sync’s \lib directory</li> </ol> <p>Data Sync comprises both a server component and a client GUI.  To start Data Sync and its server component, run datasync.bat (Windows) or datasync.sh (Linux/UNIX) from the Data Sync installation directory. 
The Data Sync icon displays in the system icon tray to indicate that the server is up and running.</p> <p>To access the client interface, click on the icon and choose Start UI to open the Data Sync client.</p> <p>Click on the icon and choose Exit to stop the Data Sync server.</p> <p><strong>Data Sync Updates</strong></p> <p>Data Sync automatically checks against the cloud environment to see if a new version is available prior to each load job executing.  It is possible that a new version of Data Sync has been installed in the cloud that is incompatible with the local version.  If the versions match, the data load continues unaffected.  If the minor version of the tool is changed in the cloud, it indicates a patch is available.  This triggers a one-time alert and an optional email if the Data Sync server is configured for emailing notifications.   If the version change indicates a major version release, an alert is created and an email sent. The data load job is stopped and will not run until the new version is installed.</p> <p>New versions (minor or major) are simple to install by following the standard installation process into a new home directory and then by copying the existing environment configuration into the new installation.</p> <p><strong>Data Sync Terminology</strong></p> <p>Connection:      Defines the data sources and target databases.</p> <p>Project:             A container which describes and organizes the information related to a data load.<br /> There can be any number of projects within a Data Sync instance.</p> <p>Job:                  The mechanism for uploading all the data sources defined in a project to BICS.</p> <p><strong>Load Strategies</strong></p> <p>When moving data to the cloud, a load strategy defines how that data is loaded from the data source into the target. 
Incremental data loading is available as long as there is a column on the source table which uniquely identifies each row (primary key) as well as another column with a “Date/Timestamp” data type that can be used to identify which rows have been added or modified since the previous load.   During the initial load job, the full source table is transmitted to the cloud.  In subsequent loads, the last update date is compared to a stored timestamp from the previous run, so that only new or changed rows are uploaded.</p> <p>Load strategy options include:</p> <ul> <li>Replace data in table: Truncates existing data and always reloads the table. Any indexes are dropped prior to data load and are recreated after the load</li> <li>Append data to table: New data is added to the table without checking for any prior existence of data.  Indexes are not dropped before the load.  Any new indexes defined on the source are created after the data load</li> <li>Update table (add new records only): Requires a primary key column. If a row with the same key does not exist, then it is inserted, or else the record is ignored</li> <li>Update table (update existing records): Requires a primary key column.  If the data with the same key is available, then it is updated, or else it is ignored</li> </ul> <p><strong>Loading Data Sets</strong></p> <p>Data sets are separate data objects used primarily by the Visual Analyzer component of BICS.  They can be thought of as separate data files stored in the cloud.   
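The update strategies above reduce to simple merge rules keyed on the primary key. A minimal sketch in Python (illustrative only — Data Sync applies these strategies server-side; the dict-of-rows representation and column names are hypothetical):

```python
# Each "table" is modelled as {primary_key: row}; "incoming" is the new batch of rows.

def update_add_new_only(table, incoming):
    """Update table (add new records only): insert unknown keys, ignore the rest."""
    merged = dict(table)
    for key, row in incoming.items():
        if key not in merged:
            merged[key] = row
    return merged

def update_existing_only(table, incoming):
    """Update table (update existing records): update known keys, ignore the rest."""
    merged = dict(table)
    for key, row in incoming.items():
        if key in merged:
            merged[key] = row
    return merged

def incremental_rows(source, last_run_timestamp):
    """Incremental selection: only rows modified since the previous run are sent."""
    return {k: r for k, r in source.items() if r["last_upd"] > last_run_timestamp}
```

In the same terms, "Replace data in table" is simply `dict(incoming)`, and "Append" adds rows without any key check.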
Data Sync can load those data sets with the following guidelines:</p> <ul> <li>If a data set by the same name does not exist, one is created automatically with default settings</li> <li>All string- and timestamp-based source columns are set to attribute type, and numeric datatype columns are set as measures in the target data set</li> <li>The maximum data set size is 50MB; data uploads fail if the data set exceeds the 50MB limit</li> </ul> <p><strong>Loading Data with Jobs</strong></p> <p>Data Sync uses the concept of a job to organize, schedule, and execute load processes. A run is an instance of a data loading job. For example, if you run a job twice, then you’ll see two run records on the History tab.  As is common with most scheduling applications, Data Sync allows for job settings to recur on a scheduled basis to meet whatever load frequency is required.</p> <p><strong>Parameterized Jobs</strong></p> <p>Parameters are available in Data Sync to customize the data loads at run time to dynamically change the conditions on the data selection step.  Rather than changing the SQL statements used to query the data sources, a parameter can be changed at the project or job level instead. A job level parameter will override a parameter with the same name at the project level.</p> <p><strong>Chaining Data Sync Load Jobs</strong></p> <p>There may be occasions where it would be beneficial to the load process to chain load jobs in a specific order. Possible reasons might be the need to load multiple separate data source tables into a common target or the need to load aggregate tables after base level data tables have completed.   The first step would be to separate the load jobs into distinct projects.</p> <p>When a job starts, a signal file with a name like “job_name”_StartSignal.txt is created in the Data Sync log\jobSignal directory for each run of the job. 
A file with the naming pattern “job_name”_CompletedSignal.txt is created when the job completes successfully, or “job_name”_FailedSignal.txt when the job fails.  Data Sync has a polling mechanism to look for the existence of these files before executing another load job.</p> <p>By editing the on_demand_job.xml file located in the conf-shared directory, you can specify the name of a file that will trigger a specific load job.</p> <p>For example:</p> <p>&lt;TriggerFile job="Load_HR_Data" file="d:\Load_The_HR_Data_Now.txt"/&gt;</p> <p>In the example above, Data Sync polls for the presence of the Load_The_HR_Data_Now.txt file, and when it is found it triggers the execution of the Load_HR_Data job.  When the job is started, the triggering file is deleted.   A time window for polling can also be specified in the XML file.</p> <p>&lt;TriggerFile job="Load_HR_Data" file="d:\Load_The_HR_Data_Now.txt"&gt;<br /> &lt;TimeWindow startTime="00:00" endTime="05:00"/&gt;<br /> &lt;TimeWindow startTime="21:00" endTime="23:00"/&gt;<br /> &lt;/TriggerFile&gt;</p> <p>Two other options exist for triggering Data Sync load jobs, either from the command line using the datasyncCmdLine.bat/.sh script file or else with polling tables stored in a database.  Both of these methods are described in detail in the Data Sync documentation.</p> <p><strong>Monitoring and Troubleshooting Data Loads</strong></p> <p>Data load jobs can be monitored while in progress using the Current Jobs tab or History tab in Data Sync.   The top panel shows the job summary information. The Tasks tab underneath that panel shows one record per user-defined data flow. The tasks and the details show important metrics, including start and end timestamps, the number of rows read and written, and throughput.</p> <p>The run log files are stored in the Data Sync log directory. 
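The signal-file and trigger-file mechanism described above can be approximated in a few lines. A rough sketch in Python (the file-name patterns follow the post, but the polling logic itself is hypothetical — Data Sync's internal implementation will differ):

```python
import os
import time

def wait_for_signal(job_name, signal_dir, kind="Completed", timeout=60.0, poll=1.0):
    """Poll for <job_name>_CompletedSignal.txt (or _FailedSignal.txt), as job chaining does."""
    path = os.path.join(signal_dir, "%s_%sSignal.txt" % (job_name, kind))
    deadline = time.time() + timeout
    while time.time() < deadline:
        if os.path.exists(path):
            return True
        time.sleep(poll)
    return False  # signal never appeared within the timeout

def trigger_job(trigger_file):
    """Create the trigger file named in on_demand_job.xml; Data Sync deletes it on pickup."""
    with open(trigger_file, "w") as fh:
        fh.write("")  # the file's presence is the trigger; its content is irrelevant
```

A chaining script would call `wait_for_signal` on the first job, then `trigger_job` for the next one.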
One directory per run is created with a naming convention of CR_&lt;Table/File Name&gt;_&lt;From Connection&gt;_&lt;To Connection&gt;.&lt;Timestamp&gt;.log.</p> <p>There are options available within the Connections configuration screens to reset a data target to do a full reload if necessary.  The reload can be for all connections or individual tables within a connection.</p> <p><strong>Data Sync Configuration Migration</strong></p> <p>As with most development environments, it may be necessary to maintain multiple instances of Data Sync for DEV, TEST, and PROD.  This can be managed within the Data Sync interface by exporting the metadata from one environment and importing it into the next.  The export and import options are available under the Tools menu.</p> <p>When exporting, there are options to export the following metadata types:</p> <ul> <li>Logical: Exports all information contained in the Project view</li> <li>System: Exports all information contained in the Connections view, except passwords for servers and database connections</li> <li>Run Time: Exports information about jobs and schedules contained in the Jobs view</li> <li>User Data: Exports users, roles, and passwords</li> </ul> Melanie Mathews http://blog.performancearchitects.com/wp/?p=2034 Wed Jun 28 2017 05:13:21 GMT-0400 (EDT) How the EU GDPR will affect the use of Machine Learning - Part 1 http://www.oralytics.com/2017/06/how-eu-gdpr-will-affect-use-of-machine.html <p>On 5 December 2015, the European Parliament, the Council and the Commission reached agreement on the new data protection rules, establishing a modern and harmonised data protection framework across the EU. 
Then on 14th April 2016 the Regulations and Directives were adopted by the European Parliament.</p> <p><img src="https://lh3.googleusercontent.com/-ZLcZG-9tDp4/WVDzoofVrYI/AAAAAAAAMLU/JmyxYg0ClII2nrBpcxxouQFtEOSfWrR-QCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="300" height="100" /></p> <p>The <a href="http://ec.europa.eu/justice/data-protection/reform/files/regulation_oj_en.pdf">EU GDPR</a> comes into effect on the 25th May, 2018.</p> <p>Are you ready?</p> <p>The EU GDPR will affect every country around the world. As long as you capture and use/analyse data captured within the EU or about citizens of the EU, then you have to comply with the EU GDPR.</p> <p>Over the past few months we have seen an increase in the number of blog posts, articles, presentations, conferences, seminars, etc. being produced on how the EU GDPR will affect you. Basically, if your company has not been working on implementing processes and procedures and ensuring they comply with the regulations, then you are a bit behind and a lot of work is ahead of you.</p> <p>Like I said, there has been a lot published and talked about regarding the EU GDPR. Most of this is about the core aspects of the regulations on protecting and securing your data. But very little if anything is being discussed regarding the use of machine learning and customer profiling.</p> <p>Do you use machine learning to profile, analyse and predict customers? Then the EU GDPR affects you.</p> <p>Article 22 of the EU GDPR outlines some basic capabilities regarding machine learning, as do Articles 13, 14, 19 and 21. </p> <p>Over the coming weeks I will have the following blog posts. 
Each of these addresses a separate issue within the EU GDPR relating to the use of machine learning.</p> <ul> <li><a href="http://www.oralytics.com/2017/07/part-2-do-i-have-permissions-to-use.html">Part 2 - Do I have permissions to use the data for data profiling?</a></li> <li><a href="http://www.oralytics.com/2017/07/part-3-ensuring-there-is-no.html">Part 3 - Ensuring there is no Discrimination in the Data and machine learning models.</a></li> <li><a href="http://www.oralytics.com/2017/07/part-4a-article-22-profiling-why-me-and.html">Part 4a - (Article 22: Profiling) Why me? and how Oracle 12c saves the day</a></li> <li><a href="http://www.oralytics.com/2017/07/part-4b-article-22-profiling-why-me-and.html">Part 4b - (Article 22: Profiling) Why me? and how Oracle 12c saves the day</a></li> <li><a href="http://www.oralytics.com/2017/07/part-5-right-to-be-forgotten-eu-gdprs.html">Part 5 - The right to be forgotten (EU GDPR)</a></li></ul> <img src="https://lh3.googleusercontent.com/-o8u2g2rr_3w/WVJnVRYK00I/AAAAAAAAMLk/ZgkIAGXNwvQhhqVPsg2RG3TDGCBjNZhfACHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="103" height="74" /> Brendan Tierney tag:blogger.com,1999:blog-4669933501315263808.post-3166577299140810418 Tue Jun 27 2017 10:11:00 GMT-0400 (EDT) Exploring the Rittman Mead Insights Lab http://www.rittmanmead.com/blog/2017/06/exploring-the-rittman-mead-insights-lab/ <h1 id="whatisourinsightslab">What is our Insights Lab?</h1> <p>The Insights Lab offers on-demand access to an experienced data science team, using a mature methodology to deliver one-off analyses and production-ready predictive models.</p> <p>Our Data Science team includes physicists, mathematicians, industry veterans and data engineers ready to help you take analytics to the next level while providing expert guidance in the process.</p> <h1 id="whyuseit">Why use it?</h1> <p>Data is cheaper to collect and easier to store than ever before. 
But collecting the data is not synonymous with getting value from it. Businesses need to do more with the same budget and are starting to look into machine learning to achieve this. </p> <p>These processes can take off some of the workload, freeing up people's time to work on more demanding tasks. However, many businesses don't know how to get started down this route, or even if they have the data necessary for a predictive model. </p> <h1 id="r">R</h1> <p>Our Data Science team primarily works in the R programming language. R is an open source language which is supported by a large community.</p> <p>The functionality of R is extended by many community-written packages which implement a wide variety of statistical and graphical techniques, including linear and nonlinear modeling, statistical tests, time-series analysis, classification and clustering, as well as packages for data access, cleaning, tidying, analysing and building reports. </p> <p>All of these packages can be found on the Comprehensive R Archive Network (CRAN), making it easy to get access to new techniques or functionalities without needing to develop them yourself (all the community-written packages work together).</p> <p>R is not only free and extendable; it also works well with other technologies, making it an ideal choice for businesses that want to start looking into advanced analytics. Python is an obvious alternative, and several of our data scientists prefer it. We're happy to use whatever our clients' teams are most familiar with.</p> <p>Experienced programmers will find R syntax easy enough to pick up and will soon be able to implement some form of machine learning. 
However, for a detailed introduction to R and a closer look at implementing some of the concepts mentioned below we do offer a training course in <a href="https://www.rittmanmead.com/advanced-analytics-oracle-r/">R</a>.</p> <h1 id="ourmethodology">Our Methodology</h1> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Untitled-Diagram-2.png" alt=""></p> <h2 id="define">Define</h2> <h3 id="defineaquestion">Define a Question</h3> <p>Analytics, for all intents and purposes, is a scientific discipline and as such requires a hypothesis to test. That means having a specific question to answer using the data.</p> <p>Starting this process without a question can lead to biases in the produced result. This is called data dredging - testing huge numbers of hypotheses about a single data set until the desired outcome is found. Many other forms of bias can be introduced accidentally; the most commonly occurring will be outlined in a future blog post. </p> <p>Once a question is defined, it is also important to understand which aspects of the question you are most interested in. Associated with this is the level of uncertainty or error that can be tolerated if the result is to be applied in a business context.</p> <p>Questions can be grouped into a number of types. Some examples will be outlined in a future blog post. </p> <h3 id="defineadataset">Define a dataset</h3> <p>The data you expect to be relevant to your question needs to be collated. Maybe supplementary data is needed, or can be added from different databases or web scraping.</p> <p>This data set then needs to be cleaned and tidied. This involves merging and reshaping the data as well as possibly summarising some variables. For example, removing spaces and non-printing characters from text and converting data types.</p> <p>The data may be in a raw format; there may be errors in the data collection, or corrupt or missing values that need to be managed. 
These records can either be removed completely or replaced with reasonable default values, determined by which makes the most sense in this specific situation. If records are removed you need to ensure that no selection biases are being introduced. </p> <p>All the data should be relevant to the question at hand, anything that isn't can be removed. There may also be external drivers for altering the data, such as privacy issues that require data to be anonymised. </p> <p>Natural language processing could be implemented for text fields. This takes bodies of text in human readable format such as emails, documents and web page content and processes it into a form that is easier to analyse.</p> <p>Any changes to the dataset need to be recorded and justified.</p> <h2 id="model">Model</h2> <h3 id="exploratoryanalysis">Exploratory Analysis</h3> <p>Exploratory data analysis involves summarising the data, investigating the structure, detecting outliers / anomalies as well as identifying patterns and trends. It can be considered as an early part of the model production process or as a preparatory step immediately prior. Exploratory analysis is driven by the data scientist, enabling them to fully understand the data set and make educated decisions; for example the best statistical methods to employ when developing a model.</p> <p>The relationships between different variables can be understood and correlations found. As the data is explored, different hypotheses could be found that may define future projects. </p> <p>Visualisations are a fundamental aspect of exploring the relationships in large datasets, allowing the identification of structure in the underlying dataset.</p> <p>This is also a good time to look at the distribution of your dataset with respect to what you want to predict. This often provides an indication of the types of models or sampling techniques that will work well and lead to accurate predictions. 
</p> <p>Variables with very few instances (or those with small variance) may not be beneficial, and in some cases could even be detrimental, increasing computation time and noise. Worse still, if these instances represent an outlier, significant (and unwarranted) value may be placed on these, leading to bias and skewed results. </p> <h3 id="statisticalmodellingprediction">Statistical Modelling/Prediction</h3> <p>The data set is split into two sub-groups, "Training" and "Test". The training set is used only in developing or "training" a model, ensuring that the data it is tested on (the test set) is unseen. This means the model is tested in a more realistic context and will help to determine whether the model has overfitted to the training set, i.e. is fitting random noise in addition to any meaningful features.</p> <p>Taking what was learned from the exploratory analysis phase, an initial model can be developed based on an appropriate application of statistical methods and modeling tools. There are many different types of model that can be applied to the data; the best tends to depend on the complexity of your data and any relationships that were found in the exploratory analysis phase. During training, the models are evaluated in accordance with an appropriate metric, the improvement of which is the "goal" of the development process. The predictions produced from the trained models when run on the test set will determine the accuracy of the model (i.e. how closely its predictions align with the unseen real data).</p> <p>A particular type of modelling method, "machine learning", can streamline and improve upon this somewhat laborious process by defining models in such a way that they are able to self-optimise, "learning" from past iterations to develop a superior version. Broadly, there are two types: supervised and unsupervised. 
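The train/test split described above is mechanically simple. A minimal sketch in pure Python (illustrative only — in practice the team's R or Python tooling provides this via library functions):

```python
import random

def train_test_split(rows, test_fraction=0.2, seed=42):
    """Shuffle once with a fixed seed, then hold out a fraction as the unseen test set."""
    rows = list(rows)
    random.Random(seed).shuffle(rows)      # fixed seed keeps the split reproducible
    n_test = int(len(rows) * test_fraction)
    return rows[n_test:], rows[:n_test]    # (train, test)
```

The model is fitted on the training portion only; the metric computed on the held-out test portion estimates performance on unseen data and exposes overfitting.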
A supervised machine learning model is given some direction from the data scientist as to the types of methods that it should use and what it is expecting. Unsupervised machine learning, on the other hand, as the name suggests, involves giving the model less information to start with and letting it decide for itself what to value, and how to approach the problem. This can help to remove bias and reduce the number of assumptions made but will be more computationally intensive, as the model has a broader scope to investigate. Usually supervised machine learning is employed in a case where the problem and data set are reasonably well understood, and unsupervised machine learning where this is not the case. </p> <p>Complex predictive modelling algorithms perform feature importance and selection internally while constructing models. These models can also report on the variable importance determined during the model preparation process.</p> <h3 id="peerreview">Peer Review</h3> <p>This is an important part of any scientific process, and effectively utilises our broad expertise in modelling at Rittman Mead. This enables us to be sure no biases were introduced that could lead to a misleading prediction and that the accuracy of the models is what could be expected if the model was run on new unseen data. Additional expert views can also lead to alternative potential avenues of investigation being identified as part of an expanded or subsequent study.</p> <h2 id="deploy">Deploy</h2> <h3 id="report">Report</h3> <p>For a scientific investigation to be credible the results must be reproducible. The reports we produce are written in R markdown and contain all the code required to reproduce the results presented. This also means it can be re-run with new data as long as it is of the same format. 
A clear and concise description of the investigation from start to finish will be provided to ensure that justification and context is given for all decisions and actions.</p> <h3 id="delivery">Delivery</h3> <p>If the result is of the required accuracy we will deploy a model API enabling customers to start utilising it immediately. <br> There is always a risk however that the data does not contain the required variables to create predictions with sufficient confidence for use. In these cases, and after the exploratory analysis phase, there may be other questions that would be beneficial to investigate. This is also a useful result, enabling us to suggest additional data to collect that may allow a more accurate result should the process be repeated later. </p> <h2 id="support">Support</h2> <p>Following delivery we are able to provide a number of support services to ensure that maximum value is extracted from the model on an on-going basis. These include: <br> - Monitoring performance and accuracy against the observed, actual values over a period of time. Should discrepancies between these values arise, they can be used to identify the need for alterations to the model. <br> - Exploring specific exceptions to the model. There may be cases in which the model consistently performs poorly. Instances like these may not have existed in the training set and the model could be re-trained accordingly. If they were in the training set these could be weighted differently to ensure a better accuracy, or could be represented by a separate model. <br> - Updates to the model to reflect discrepancies identified through monitoring, changes of circumstance, or the availability of new data. <br> - Many problems are time dependent and so model performance is expected to degrade, requiring retraining on more up to date data. 
</p> <h1 id="summary">Summary</h1> <p>In conclusion, our Insights Lab has a clearly defined and proven process for data science projects that can be adapted to fit a range of problems.</p> <p>Contact us to learn how the Insights Lab can help your organization get the most from its data, and schedule your consultation today. <br> Contact us at <a href="mailto:info+insights@rittmanmead.com?subject=InsightsLab">info@rittmanmead.com</a></p> Hannah Patrick 2f71e121-2bb6-41e8-8f42-4cb837d56352 Tue Jun 27 2017 09:58:08 GMT-0400 (EDT) Data Redaction New Features in Oracle 12c Release 2 https://gavinsoorma.com/2017/06/data-redaction-new-features-in-oracle-12c-release-2/ <p><strong>Data Redaction</strong> was introduced in Oracle Database 12c Release 1 <a href="https://gavinsoorma.com/2014/01/oracle-database-12c-new-feature-data-redaction"> (read a note on this)<br /> </a> where we could hide or mask sensitive data in tables from non-privileged users. The data was &#8216;redacted&#8217; at query run time and was stored in the database in its normal non-redacted form. The redaction could 
</p><div class="mgm_private_no_access"><div style="border-style:solid; border-width:1px; margin-bottom:1em; background-color:#E4F2FD; border-color:#C6D9E9; margin:5px; font-family:'Lucida Grande','Lucida Sans Unicode',Tahoma,Verdana,sans-serif; font-size:13px; color:#333333;"> <div style="margin: 5px 10px;">You need to be logged in to see this part of the content. Please <a href="https://gavinsoorma.com/login/?redirect_to=https://gavinsoorma.com/2017/06/data-redaction-new-features-in-oracle-12c-release-2/"><b>Login</b></a> to access. 
</div> </div></div> Gavin Soorma https://gavinsoorma.com/?p=7695 Tue Jun 27 2017 01:08:10 GMT-0400 (EDT) Using Tableau to Show Variance and Uncertainty http://www.rittmanmead.com/blog/2017/06/using-tableau-to-show-variance-and-uncertainty/ <p>Recently, I watched an amazing <a href="https://openvisconf.com/#acox-video-item">keynote presentation from Amanda Cox</a> at OpenVis. Toward the beginning of the presentation, Amanda explained that people tend to feel and interpret things differently. She went on to say that, “There’s this gap between what you say or what you think you’re saying, and what people hear.”</p> <p>While I found her entire presentation extremely interesting, that statement in particular really made me think. When I view a visualization or report, am I truly understanding what the results are telling me? Personally, when I’m presented a chart or graph I tend to take what I’m seeing as absolute fact, but often there’s a bit of nuance there. When we have a fair amount of variance or uncertainty in our data, what are some effective ways to communicate that to our intended audience? </p> <p>In this blog I'll demonstrate some examples of how to show uncertainty and variance in Tableau. All of the following visualizations are made using Tableau Public so while I won’t go into all the nitty-gritty detail here, follow <a href="https://public.tableau.com/views/GCBC-Variance/VarianceandUncertaintyVisualizations">this link</a> to download the workbook and reverse engineer the visualizations yourself if you'd like.</p> <p>First things first, I need some data to explore. If you've ever taken our training you might recall the Gourmet Coffee &amp; Bakery Company (GCBC) data that we use for our courses. Since I’m more interested in demonstrating what we can do with the visualizations and less interested in the actual data itself, this sample dataset will be more than suitable for my needs. 
I'll begin by pulling the relevant data into Tableau using <a href="http://www.rittmanmead.com/unify">Unify</a>. </p> <p>If you haven't already heard about Unify, it allows Tableau to seamlessly connect to OBIEE so that you can take advantage of the subject areas created there. Now that I have some data, let’s look at our average order history by month. To keep things simple, I’ve filtered so that we’re only viewing data for Times Square.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Variance-and-Uncertainty-Visualizations-3.png" alt="Average Orders for 2015-2016"></p> <p>On this simple visualization we can already draw some insights. We can see that the data is cyclical, with a peak early in the year around February and another in August. We can also see that the minimum number of orders in a month appears to be about 360 orders, while the maximum is just under 400 orders.</p> <p>When someone asks to see “average orders by month”, this is generally what people expect to see, and depending upon the intended audience, a chart like this might be completely acceptable. However, when we display aggregated data, we no longer have any visibility into the variance of the underlying data.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Variance-and-Uncertainty-Visualizations--1--2.png" alt="Daily Orders"></p> <p>If we display the orders at the day level instead of the month level, we can still see the cyclical nature of the data, but we can also see additional detail, and you’ll notice there’s quite a bit more “noise” in the data. We had a particularly poor day in mid-May of 2014 with under 350 orders. 
We’ve also had a considerable number of good days during the summer months when we cleared 415 orders.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Variance-and-Uncertainty-Visualizations--2--3.png" alt="Moving Average"></p> <p>Depending upon your audience and the dataset, some of these charts might include too much information and be too busy. If the viewer can’t make sense of what you’re putting in front of them, there’s no way they’ll be able to discern any meaningful insights from the underlying dataset. Visualizations must be easy to read. One way to provide information about the volatility of the data, but with less detail, would be to use confidence bands, similar to how one might view stock data. In this example I’ve calculated and displayed a moving average, as well as upper and lower confidence bands using the 3rd standard deviation. Confidence bands show how much uncertainty there is in your data. When the bands are close, you can be more confident in your results and expectations.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Variance-and-Uncertainty-Visualizations--3--4.png" alt="Orders by Month"> <img src="http://www.rittmanmead.com/blog/content/images/2017/06/Variance-and-Uncertainty-Visualizations--4--2.png" alt="Orders by Day"></p> <p>An additional option is the use of a scatterplot. The awesome thing about a scatterplot is that not only does it allow you to see the variance of your data, but if you play with the size of your shapes and tweak the transparency just right, you also get a sense of the density of your dataset, because you can visualize where those points lie in relation to each other. </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Variance-and-Uncertainty-Visualizations--5--2.png" alt="Boxplot"></p> <p>The final example I have for you is to show the distribution of your data using a boxplot. 
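Before moving on to the boxplot, note that the moving average and confidence bands shown above boil down to simple arithmetic. Tableau computes them with table calculations; this plain-Python sketch just illustrates the math, with the window size as an illustrative assumption:

```python
# Sketch of a moving average with upper/lower confidence bands at
# +/- 3 standard deviations. The 7-day window is an illustrative assumption.
from statistics import mean, stdev

def confidence_bands(values, window=7, n_sigma=3.0):
    """Return (moving_avg, upper, lower) lists, one entry per position
    once a full window of values is available."""
    avgs, upper, lower = [], [], []
    for i in range(window - 1, len(values)):
        chunk = values[i - window + 1 : i + 1]
        mu, sigma = mean(chunk), stdev(chunk)
        avgs.append(mu)
        upper.append(mu + n_sigma * sigma)
        lower.append(mu - n_sigma * sigma)
    return avgs, upper, lower

# Hypothetical daily order counts in the same ballpark as the GCBC data:
daily_orders = [380, 372, 391, 365, 358, 402, 388, 379, 367, 395]
avgs, upper, lower = confidence_bands(daily_orders)
```

The wider the gap between `upper` and `lower`, the more volatile the underlying days are, which is exactly the signal the bands add on top of the plain average.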
If you’re not familiar with boxplots, the line in the middle of the box is the median. The bottom and top of the box, known as the bottom and top hinge, give you the 25th and 75th percentiles respectively, and the whiskers outside of the box show the minimum and maximum values, excluding any outliers. Outliers are shown as dots.</p> <p>I want to take a brief moment to touch on the fairly controversial subject of whether or not to include a zero value in your axes. When you have a non-zero baseline, it distorts your data and differences are exaggerated. This can be misleading and might lead your audience into drawing inaccurate conclusions. </p> <p><img src="https://accuweather.brightspotcdn.com/dims4/default/7fa9453/2147483647/resize/590x/quality/90/?url=http%3A%2F%2Faccuweather-bsp.s3.amazonaws.com%2F49%2F59%2F74d8009f4a8ba3e61ef12fbe52e0%2F2016-tornado-graph.S.%20Tornado%20Count%202013-2016.jpg" alt=""></p> <p>For example, a quick Google search revealed <a href="http://www.accuweather.com/en/weather-news/experts-explain-why-the-us-saw-a-low-tornado-count-in-2016/70000149">this image on Accuweather</a> showing the count of tornados in the U.S. for 2013-2016. At first glance it appears as though there were almost 3 times more tornados in 2015 than in 2013 and 2014, but that would be incorrect.</p> <p>On the flipside, there are cases where slight fluctuations in the data are extremely important but are too small to be noticed when the axis extends to zero. 
Philip Bump did an excellent job demonstrating this in his <a href="https://www.washingtonpost.com/news/the-fix/wp/2015/12/14/why-the-national-reviews-global-temperature-graph-is-so-misleading/">"Why this National Review global temperature graph is so misleading"</a> article in The Washington Post.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-02-at-11.16.21-AM.png" alt=""></p> <p>Philip begins his article with this chart tweeted by the National Review, which appears to prove that global temperatures haven’t changed in the last 100 years. As he goes on to explain, this chart is misleading because of the scale used. The y-axis stretches from -10 to 110 degrees, making it impossible to see a 2-degree increase over the last 50 years or so.</p> <p>The general rule of thumb is that you should always start from zero. In fact, when you create a visualization in Tableau, it includes a zero by default. Usually, I agree with this rule, and the vast majority of the time I do include a zero, but I don’t believe there can be a hard and fast rule, as there will always be an exception. Bar charts are used to communicate absolute values, so the size of the bar needs to be proportional to the overall value. I agree that bar charts should extend to zero, because if they don’t, we distort what the data is telling us. With line charts and scatterplots, we tend to look at the positioning of the data points relative to each other. Since we’re not as interested in the value of the data, I don’t feel the decision to include a zero or not is as cut and dried.</p> <p>The issue boils down to what it is you’re trying to communicate with your chart. In this particular case, I’m trying to highlight the uncertainty, so the chart needs to draw attention to the range of that uncertainty. For this reason, I have not extended the axes in the above examples to zero. 
You are free to disagree with me on this, but as long as you’re not intentionally misleading your audience, I feel that in instances such as these the rule can be relaxed.</p> <p>These are only a few examples of the many ways to show uncertainty and variance within your data. Displaying the volatility of the data and giving viewers a level of confidence in the results is immensely powerful. Remember that while we can come up with the most amazing visualizations, if the results are misleading or misinterpreted and users draw inaccurate conclusions, what’s the point? </p> Jason Baer def67bed-6823-488a-a379-d08e50e9fa1a Mon Jun 26 2017 10:00:00 GMT-0400 (EDT) PGX – Parallel Graph AnalytiX : the Oracle graph analysis brain https://gianniceresa.com/2017/06/pgx-oracle-graph-analysis-brain/ <p>Graph databases: who hasn&#8217;t heard these words by now?<br /> The newest shiny tool for data scientists, the latest addition to the analytical toolbox after big data solutions a few years ago.</p> <p>So far it seems to still be a niche solution, too new to be widely adopted. Which also means it&#8217;s the perfect time to take this train and not miss it.<br /> I jumped on that train &#8220;seriously&#8221; in recent weeks, and here, and in some future posts, you will find my findings. This is just a quick intro on what will make my work with graphs possible, the graph &#8220;brain&#8221;.</p> <h2>Why graph databases?</h2> <p>You can, for sure, do the same kind of analysis a graph database allows you to perform on your relational database. But you will have to write tons of code, and it will probably perform quite badly.</p> <p>Relational databases are excellent for their job, and nobody is saying relational is dead. But you can&#8217;t expect them to remain excellent for analysis or activities where other technologies / models would fit better. 
Same story with cubes: sure, you can store data in your database and perform analysis on it, but for some activities you will never outperform, or even get close to, a good Essbase cube.</p> <p>Same for graph database engines: they are optimized for performing analysis on graphs and managing data as graphs composed of nodes (aka vertices) and edges, each having properties and labels.</p> <div id="attachment_498" style="width: 310px" class="wp-caption aligncenter"><img class="wp-image-498 size-medium" src="https://gianniceresa.com/wp-content/uploads/2017/06/pgy_sample_graph-300x130.jpg" alt="PGX: the brain of graph analysis - sample graph" width="300" height="130" srcset="https://gianniceresa.com/wp-content/uploads/2017/06/pgy_sample_graph-300x130.jpg 300w, https://gianniceresa.com/wp-content/uploads/2017/06/pgy_sample_graph.jpg 600w" sizes="(max-width: 300px) 100vw, 300px" /><p class="wp-caption-text">A sample property graph</p></div> <p>There are multiple engines available on the market to store and manipulate graphs. I admit I just knew Neo4j, and I didn&#8217;t search further, as Neo4j can easily be used to get started with a graph and get your hands dirty playing with graphs.</p> <blockquote><p><pre class="crayon-plain-tag">docker run -d -p 7474:7474 -p 7687:7687 -P --name neo4j neo4j:latest</pre></p> <p>Open a browser and connect to http://&lt;docker host&gt;:7474 and there you are!<br /> Follow the instructions on screen and enjoy graphs. Guided tutorials and a web visualization of your graph make it easy to get started.</p></blockquote> <h2>Oracle graph solutions</h2> <p>Oracle joined the party with their property graph solution: PGX, the acronym for Parallel Graph AnalytiX.<br /> Actually, PGX can be a standalone graph tool, but in the current situation it is more the brain of the graph implementations Oracle uses in various tools. It&#8217;s the &#8220;tool&#8221; performing operations on graphs in-memory, but it doesn&#8217;t provide storage directly. 
Storage (read and write) is provided by external solutions.<br /> The description of what it does sounds great, and the documentation is nicely written with lots of examples.</p> <blockquote><p><strong>What is PGX?</strong><br /> PGX is a toolkit for graph analysis &#8211; both running algorithms such as PageRank against graphs, and performing SQL-like pattern-matching against graphs, using the results of algorithmic analysis. Algorithms are parallelized for extreme performance. The PGX toolkit includes both a single-node in-memory engine, and a distributed engine for extremely large graphs. Graphs can be loaded from a variety of sources including flat files, SQL and NoSQL databases and Apache Spark and Hadoop; incremental updates are supported.<br /> (<a href="http://www.oracle.com/technetwork/oracle-labs/parallel-graph-analytix/overview/index.html" target="_blank" rel="noopener">http://www.oracle.com/technetwork/oracle-labs/parallel-graph-analytix/overview/index.html</a>)</p></blockquote> <div id="attachment_490" style="width: 500px" class="wp-caption aligncenter"><img class="size-full wp-image-490" src="https://gianniceresa.com/wp-content/uploads/2017/06/pgx_overview.png" alt="PGX overview" width="490" height="302" srcset="https://gianniceresa.com/wp-content/uploads/2017/06/pgx_overview.png 490w, https://gianniceresa.com/wp-content/uploads/2017/06/pgx_overview-300x185.png 300w" sizes="(max-width: 490px) 100vw, 490px" /><p class="wp-caption-text">PGX overview</p></div> <p>As you can see from the generic structure of PGX, it&#8217;s a client-server &#8220;kind of&#8221; solution where the client interacts with the PGX engine (the server), which will then, if required, interact with different kinds of storage to load or store graphs.<br /> You can also build an ephemeral graph, on the fly, from the client and use it for the required analysis in memory and never store it anywhere.</p> <h3>PGX clients</h3> <p>Multiple clients already exist: PGX shell, java, 
javascript, python, Zeppelin notebook.<br /> This list will grow in the future, as the exposed API and REST interface can easily be used by other languages or tools.</p> <p>PGX shell is probably the most complete and native one, followed by the Java API. The python module (I still didn&#8217;t find a link to download and install it directly, but you can find it embedded in PGX in Database 12cR2) seems to be using the Java API, so all the same functionalities can be exposed, even if it is currently more limited.<br /> The PGX Zeppelin interpreter is still a work in progress: if PGX isn&#8217;t local but a remote instance, some functions no longer work (there isn&#8217;t full support for the shell functions when connected remotely via Zeppelin).</p> <h3>PGX data sources</h3> <p>The list of supported sources so far seems to be: flat files (filesystem), SQL (database), NoSQL, Spark and HDFS.<br /> Here the issues start, as apparently not all the sources are available in all the PGX distributions.</p> <p>So far, I have used the flat files and SQL loading from the database: globally, it worked fine. 
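As an aside, loading a graph from a flat file boils down to parsing vertices and edges, after which the engine can run algorithms such as PageRank over them. The sketch below is pure Python, for intuition only: it is not the PGX API (see the PGX documentation for the real loading calls), the edge-list format and damping factor are illustrative assumptions, and dangling vertices (no outgoing edges) are not handled.

```python
# Generic illustration (NOT the PGX API): parse a flat edge list,
# then run a naive PageRank over it. Assumes every vertex has at
# least one outgoing edge (no dangling-vertex handling).
from collections import defaultdict

EDGE_LIST = """\
128 333
128 99
333 99
99 128
"""

def parse_edges(text):
    """Parse 'src dst' lines into a list of (src, dst) tuples."""
    return [tuple(line.split()) for line in text.splitlines()]

def pagerank(edges, damping=0.85, iterations=50):
    out_edges = defaultdict(list)
    nodes = set()
    for src, dst in edges:
        out_edges[src].append(dst)
        nodes.update((src, dst))
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        incoming = defaultdict(float)
        for src, dsts in out_edges.items():
            share = rank[src] / len(dsts)   # distribute rank over out-edges
            for dst in dsts:
                incoming[dst] += share
        rank = {n: (1 - damping) / len(nodes) + damping * incoming[n]
                for n in nodes}
    return rank

rank = pagerank(parse_edges(EDGE_LIST))
```

An engine like PGX parallelises exactly this kind of iteration, which is where the &#8220;extreme performance&#8221; in the overview quote comes from.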
I&#8217;m probably not going to look into NoSQL, Spark and HDFS support, as I don&#8217;t have these tools on my Docker images.</p> <h2>Multiple versions and distributions</h2> <p>If in theory and on paper PGX is all nice and cool, there are some issues&#8230;<br /> So far there seem to be multiple different versions, both in terms of version number and functionalities.</p> <p>In Oracle Database 12c Release 2 (12.2.0.1.0) you have PGX version 2.1.0 with support to source (load) graphs from the database or filesystem (based on the list of JAR files I saw).<br /> In Oracle Big Data Lite Virtual Machine 4.8 you will find PGX version 2.4.0 with support to source graphs from the filesystem, NoSQL and HDFS.<br /> If you download PGX from the OTN website, you get version 2.4.1, apparently with support to source graphs from the filesystem only.</p> <p>If you apply &#8220;Patch 25640325: MISSING PGQL FUNCTION IN ORACLE DATABASE RELEASE 12.2.0.1&#8221; to your 12c Release 2 database, you will end up with PGX 2.4.0 and the same sources for graphs: database and filesystem. The patch, in addition to a newer version, brings support for PGQL.</p> <p>To make it short: 3 versions, 3 different data sources = not easy to really test all the features of PGX. 
Double check the documentation for the notes at the top of pages listing which release supports the functionality (mainly when related to graph loading).</p> <p>The version provided with the Big Data Lite virtual machine must be the one named the &#8220;Oracle Big Data Spatial and Graph&#8221; package, while the one delivered with database 12cR2 must be the one named the &#8220;Oracle Spatial and Graph&#8221; package.</p> <p>Apparently, reading posts on the OTN forum, I&#8217;m not the only one dreaming of PGX 2.5.0 merging the current versions and providing support for all the sources, making it easier to test and compare options.<br /> I can understand and agree that licensing will be different and can justify support for different or limited sources, but the software must be developed as a single solution to guarantee compatibility and flexibility.</p> <h2>How to use it?</h2> <p>PGX can be used in multiple ways, as you can see from the following picture I took from the doc.</p> <div id="attachment_493" style="width: 813px" class="wp-caption aligncenter"><img class="size-full wp-image-493" src="https://gianniceresa.com/wp-content/uploads/2017/06/pgx_usage_diagram.png" alt="PGX usage" width="803" height="663" srcset="https://gianniceresa.com/wp-content/uploads/2017/06/pgx_usage_diagram.png 803w, https://gianniceresa.com/wp-content/uploads/2017/06/pgx_usage_diagram-300x248.png 300w, https://gianniceresa.com/wp-content/uploads/2017/06/pgx_usage_diagram-768x634.png 768w" sizes="(max-width: 803px) 100vw, 803px" /><p class="wp-caption-text">Usage of PGX</p></div> <p>The simplest and quickest way is to use the PGX shell you get with PGX (./bin/pgx). 
If you take the OTN version, all it takes is to unzip the file, meet the requirements (mainly Java, in the end) and you are ready to start with the shell.</p> <blockquote><p><strong>How to exit the PGX shell?</strong></p> <p>It took me some time to find how to exit the PGX shell; all the classical &#8220;exit&#8221;, &#8220;quit&#8221;, &#8220;stop&#8221;, &#8220;please let me out&#8221; didn&#8217;t work&#8230;<br /> I finally found that <pre class="crayon-plain-tag">System.exit(0)</pre> works fine for that.</p></blockquote> <p>In my case I decided to use <a href="https://zeppelin.apache.org/" target="_blank" rel="noopener">Apache Zeppelin</a>, as there is a PGX interpreter provided by Oracle, and Zeppelin also supports Python (by using the pyopg module), SQL (by using a JDBC interpreter) and a few other things. This makes Zeppelin a good way to document and test commands: you can have nice documentation using markdown and, next to it, code you execute on the fly.</p> <p>An extra argument justifying the usage of Zeppelin is the Oracle Lab Data Studio application, which will come at some point in the coming months (as always with Oracle: not guaranteed) and which will support importing Zeppelin notebooks. So, nothing will be lost&#8230;<br /> It must be noted that, out of the box, there is no visualization plugin available in Zeppelin so far. Oracle Lab Data Studio will provide that out of the box.</p> <p>You can of course get something similar by using <a href="http://jupyter.org/" target="_blank" rel="noopener">Jupyter</a>, another notebook application (Python will work fine, but I didn&#8217;t look at porting the PGX interpreter from Zeppelin to Jupyter for now).</p> <p>So far, I have a setup with Zeppelin, Oracle Database 12cR2 and a PGX server running in three Docker containers and communicating together. 
It&#8217;s the closest I&#8217;ve got to what a standard &#8220;enterprise&#8221; setup would look like.<br /> I&#8217;m still finalizing the setup and will write about it in a future post, maybe after the OTN PGX release also supports sourcing from the database. The Docker images will also be provided, as they are extensions of existing images but pre-configured to work together (SSL certificates etc.).</p> <h2>Where to start? Documentation?</h2> <p>The documentation is really nicely done. Multiple tutorials make it simple to follow. Some examples and use cases. All the details about the API and how things work.</p> <p>It&#8217;s definitely the best place to get started with PGX: <a href="https://docs.oracle.com/cd/E56133_01/latest/index.html" target="_blank" rel="noopener">https://docs.oracle.com/cd/E56133_01/latest/index.html</a></p> <p>Stay tuned for more content about PGX and property graphs in general; I&#8217;m going to work on this topic for quite some time&#8230;</p> <p>The post <a rel="nofollow" href="https://gianniceresa.com/2017/06/pgx-oracle-graph-analysis-brain/">PGX &#8211; Parallel Graph AnalytiX : the Oracle graph analysis brain</a> appeared first on <a rel="nofollow" href="https://gianniceresa.com">Gianni&#039;s world: things crossing my mind</a>.</p> Gianni Ceresa https://gianniceresa.com/?p=485 Mon Jun 26 2017 03:56:20 GMT-0400 (EDT) Kscope17 Conference Analytics Part 2: Software http://redpillanalytics.com/protected-kscope17-conference-analytics/ <p><img width="300" height="200" src="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?fit=300%2C200" class="attachment-medium size-medium wp-post-image" alt="" srcset="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?w=1400 1400w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?resize=300%2C200 300w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?resize=768%2C512 768w, 
https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?resize=1024%2C682 1024w" sizes="(max-width: 300px) 100vw, 300px" data-attachment-id="5213" data-permalink="http://redpillanalytics.com/protected-kscope17-conference-analytics/phone/" data-orig-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?fit=1400%2C933" data-orig-size="1400,933" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496681464&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;1000&quot;,&quot;shutter_speed&quot;:&quot;0.05&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="phone" data-image-description="" data-medium-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?fit=300%2C200" data-large-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?fit=1024%2C682" /></p><h3>Editor&#8217;s Note:</h3> <p>This year, Red Pill Analytics is the Analytics Sponsor at ODTUG Kscope17. Our company motto is #challengeeverything &#8211; so we knew we wanted to do something different and unexpected while at the conference.<br /> What we eventually landed on was creating Analytics Stations using IoT technologies to show how an old-school object, like a rotary phone, can be repurposed and turned into an interactive device.<br /> <a href="http://redpillanalytics.com/kscope17-analytics-hardware/">Part 1 focuses on hardware.</a><br /> <a href="http://redpillanalytics.com/protected-kscope17-conference-analytics/">Part 2 focuses on software.</a><br /> Kscope17 also used beacon technology to analyze conference attendee activities. 
Red Pill Analytics pulled that information through a REST API and told the story of Kscope17 using Oracle Data Visualization. This will be explained in Part 3, coming soon!</p> <p>&nbsp;</p> <hr /> <p>&nbsp;</p> <p>Because the project uses a Raspberry Pi Model 3B running Raspbian (a distribution of Linux designed for the Raspberry Pi), all of our software runs on Linux on an ARM processor. The project primarily uses a framework called Electron (<a href="https://electron.atom.io/" target="_blank" rel="noopener noreferrer">https://electron.atom.io/</a>) for our logic and display code and our hardware interaction code.</p> <p>The first step in setting up the Raspberry Pi is to burn Raspbian (found on the official Raspberry Pi website, <a href="https://www.raspberrypi.org/downloads/" target="_blank" rel="noopener noreferrer">https://www.raspberrypi.org/downloads/</a>) to a micro SD card, which will be inserted into the Pi and act as the storage device and operating medium for the embedded device. I chose to burn the latest Raspbian image using a tool called Etcher (<a href="https://etcher.io/" target="_blank" rel="noopener noreferrer">https://etcher.io/</a>). 
The next step was to insert the micro SD card into the Pi, connect a screen via HDMI cable, connect a USB keyboard and mouse for initial setup, and connect a sufficiently specced power source.</p> <p>Once the Pi booted up, the first things I did were to connect to my local Wi-Fi network, disable underscan so that the monitor output consumed all of the available screen space, enable SSH, set locale information (keyboard/timezone) to the United States, and change the system password for security.</p> <div class="cols-wrapper cols-3"> <div class="col"> <p><img data-attachment-id="5178" data-permalink="http://redpillanalytics.com/protected-kscope17-conference-analytics/ec1/" data-orig-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC1.png?fit=504%2C429" data-orig-size="504,429" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="EC1" data-image-description="" data-medium-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC1.png?fit=300%2C255" data-large-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC1.png?fit=504%2C429" class="size-full wp-image-5178 aligncenter" src="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC1.png?resize=504%2C429" alt="" srcset="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC1.png?w=504 504w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC1.png?resize=300%2C255 300w" sizes="(max-width: 504px) 100vw, 504px" data-recalc-dims="1" /></p> </div> <div class="col"> <p><img data-attachment-id="5179" 
data-permalink="http://redpillanalytics.com/protected-kscope17-conference-analytics/ec2/" data-orig-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC2.png?fit=495%2C423" data-orig-size="495,423" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="EC2" data-image-description="" data-medium-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC2.png?fit=300%2C256" data-large-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC2.png?fit=495%2C423" class="size-full wp-image-5179 aligncenter" src="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC2.png?resize=495%2C423" alt="" srcset="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC2.png?w=495 495w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC2.png?resize=300%2C256 300w" sizes="(max-width: 495px) 100vw, 495px" data-recalc-dims="1" /></p> </div> <div class="col nomargin"> <p><img data-attachment-id="5180" data-permalink="http://redpillanalytics.com/protected-kscope17-conference-analytics/ec3/" data-orig-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC3.png?fit=496%2C424" data-orig-size="496,424" data-comments-opened="1" 
data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="EC3" data-image-description="" data-medium-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC3.png?fit=300%2C256" data-large-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC3.png?fit=496%2C424" class="size-full wp-image-5180 aligncenter" src="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC3.png?resize=496%2C424" alt="" srcset="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC3.png?w=496 496w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC3.png?resize=300%2C256 300w" sizes="(max-width: 496px) 100vw, 496px" data-recalc-dims="1" /></p> </div> </div> <div class="cols-wrapper cols-3 cols-default"> <div class="col"> <p>Locale settings</p> </div> <div class="col"> <p>Enabling SSH for SFTP</p> </div> <div class="col nomargin"> <p>Disable underscan</p> </div> </div> <p>Next was to transfer the electron application files from my development machine to the Pi via Filezilla and SFTP (note that in order to do this SSH must first be enabled via raspi-config), install the node/electron dependencies and test the application. I installed node with <code>curl -sL https://deb.nodesource.com/setup_8.x | sudo -E bash -</code> followed by <code>sudo apt install nodejs</code>.</p> <p>After verifying the application worked as expected, I setup the node/electron app to run at Pi startup so it could act appliance like. 
To do this I added a single line <code>@sh /home/pi/Desktop/phone.sh</code> to the file <code>/home/pi/.config/lxsession/LXDE-pi/autostart</code> and then created the following script file on the Desktop of my Pi.</p> <div id="crayon-594bf2ad15997805995683" class="crayon-syntax crayon-theme-arduino-ide crayon-font-monaco crayon-os-mac print-yes notranslate crayon-wrapped" data-settings=" minimize scroll-mouseover wrap"> <div class="crayon-toolbar" data-settings=" show"> <p><img data-attachment-id="5210" data-permalink="http://redpillanalytics.com/protected-kscope17-conference-analytics/screen-shot-2017-06-23-at-3-30-10-pm/" data-orig-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.10-PM.png?fit=1350%2C178" data-orig-size="1350,178" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="Screen Shot 2017-06-23 at 3.30.10 PM" data-image-description="" data-medium-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.10-PM.png?fit=300%2C40" data-large-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.10-PM.png?fit=1024%2C135" class="alignnone size-full wp-image-5210" src="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.10-PM.png?resize=1170%2C154" alt="" srcset="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.10-PM.png?w=1350 1350w, 
https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.10-PM.png?resize=300%2C40 300w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.10-PM.png?resize=768%2C101 768w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.10-PM.png?resize=1024%2C135 1024w" sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /></p> <p>Lastly, I needed to disable the cursor and the blank screensaver so the phone could act like a kiosk and stay awake permanently. This consists of inserting the following lines into the same file as above, <code>/home/pi/.config/lxsession/LXDE-pi/autostart</code>, above the script execution line.</p> </div> </div> <div id="crayon-594bf2ad159a6696490626" class="crayon-syntax crayon-theme-arduino-ide crayon-font-monaco crayon-os-mac print-yes notranslate crayon-wrapped" data-settings=" minimize scroll-mouseover wrap"> <div class="crayon-toolbar" data-settings=" show"> <div class="crayon-tools"> <div class="crayon-button crayon-nums-button crayon-pressed" title="Toggle Line Numbers"><img data-attachment-id="5211" data-permalink="http://redpillanalytics.com/protected-kscope17-conference-analytics/screen-shot-2017-06-23-at-3-30-17-pm/" data-orig-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.17-PM.png?fit=1346%2C168" data-orig-size="1346,168" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="Screen Shot 2017-06-23 at 3.30.17 PM" data-image-description="" 
data-medium-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.17-PM.png?fit=300%2C37" data-large-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.17-PM.png?fit=1024%2C128" class="alignnone size-full wp-image-5211" src="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.17-PM.png?resize=1170%2C146" alt="" srcset="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.17-PM.png?w=1346 1346w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.17-PM.png?resize=300%2C37 300w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.17-PM.png?resize=768%2C96 768w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/Screen-Shot-2017-06-23-at-3.30.17-PM.png?resize=1024%2C128 1024w" sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /></div> <div class="crayon-button crayon-plain-button" title="Toggle Plain Code">And just like that the Pi functions as expected in a kiosk fashion. 
It boots directly into our Electron application, the cursor hides, and the screensaver is disabled.</div> </div> </div> </div> Emily Carlsen http://redpillanalytics.com/?p=5170 Sun Jun 25 2017 16:49:11 GMT-0400 (EDT) Kscope17 Conference Analytics Part 1: Hardware http://redpillanalytics.com/kscope17-analytics-hardware/ <p><img width="300" height="200" src="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?fit=300%2C200" class="attachment-medium size-medium wp-post-image" alt="" srcset="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?w=1400 1400w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?resize=300%2C200 300w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?resize=768%2C512 768w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?resize=1024%2C682 1024w" sizes="(max-width: 300px) 100vw, 300px" data-attachment-id="5213" data-permalink="http://redpillanalytics.com/protected-kscope17-conference-analytics/phone/" data-orig-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?fit=1400%2C933" data-orig-size="1400,933" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496681464&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;1000&quot;,&quot;shutter_speed&quot;:&quot;0.05&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="phone" data-image-description="" data-medium-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?fit=300%2C200" data-large-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/phone.jpg?fit=1024%2C682" /></p><h3>Editor's Note:</h3> <p>This year, Red Pill 
Analytics is the Analytics Sponsor at ODTUG Kscope17. Our company motto is #challengeeverything – so we knew we wanted to do something different and unexpected while at the conference.<br /> What we eventually landed on was creating Analytics Stations using IoT technologies to show how an old-school object, like a rotary phone, can be repurposed and turned into an interactive device.<br /> <a href="http://redpillanalytics.com/kscope17-analytics-hardware/">Part 1 focuses on hardware.</a><br /> <a href="http://redpillanalytics.com/protected-kscope17-conference-analytics/">Part 2 focuses on software.</a><br /> Kscope17 also used beacon technology to analyze conference attendee activities. Red Pill Analytics pulled that information through a REST API and told the story of Kscope17 using Oracle Data Visualization. This will be explained in Part 3, coming soon!</p> <hr /> <p style="text-align: left;">Step 1 was to take apart the Cortelco touch-tone phone (available on Amazon for ~$20 at time of writing), see what the internals looked like, and figure out how we could tap into the number pad and telephone hook switch. Taking the phone apart was as simple as undoing the three Phillips-head screws holding the molded plastic top in place, revealing several phone jacks/wires, a bell, the number pad, a hook switch, and a single small circuit board acting as the phone's intelligence. The next step was to unscrew the circuit boards and remove the unneeded components, such as the bell, main circuit board, and unused telephone jacks&#8230; leaving the desired components, namely: the number pad, the side phone jack (that runs to the headset), and the vertical-mount circuit board that contains the hook switch. 
See the images below:<img data-attachment-id="5217" data-permalink="http://redpillanalytics.com/kscope17-analytics-hardware/ec4/" data-orig-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC4.jpg?fit=5472%2C3648" data-orig-size="5472,3648" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496682354&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;640&quot;,&quot;shutter_speed&quot;:&quot;0.0333333333333&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="EC4" data-image-description="" data-medium-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC4.jpg?fit=300%2C200" data-large-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC4.jpg?fit=1024%2C683" class="alignnone size-full wp-image-5217" src="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC4.jpg?resize=1170%2C780" alt="" srcset="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC4.jpg?w=5472 5472w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC4.jpg?resize=300%2C200 300w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC4.jpg?resize=768%2C512 768w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC4.jpg?resize=1024%2C683 1024w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC4.jpg?w=2340 2340w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC4.jpg?w=3510 3510w" sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /><br /> Telephone internals</p> <div class="cols-wrapper cols-2"> <div class="col"> <p><img data-attachment-id="5218" data-permalink="http://redpillanalytics.com/kscope17-analytics-hardware/ec5/" 
data-orig-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC5.jpg?fit=5472%2C3648" data-orig-size="5472,3648" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496683708&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;400&quot;,&quot;shutter_speed&quot;:&quot;0.0333333333333&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="EC5" data-image-description="" data-medium-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC5.jpg?fit=300%2C200" data-large-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC5.jpg?fit=1024%2C683" class="alignnone size-full wp-image-5218" src="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC5.jpg?resize=1170%2C780" alt="" srcset="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC5.jpg?w=5472 5472w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC5.jpg?resize=300%2C200 300w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC5.jpg?resize=768%2C512 768w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC5.jpg?resize=1024%2C683 1024w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC5.jpg?w=2340 2340w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC5.jpg?w=3510 3510w" sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /></p> </div> <div class="col nomargin"> <p><img data-attachment-id="5219" data-permalink="http://redpillanalytics.com/kscope17-analytics-hardware/ec6/" data-orig-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC6.jpg?fit=5472%2C3648" data-orig-size="5472,3648" data-comments-opened="1" 
data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496681836&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;800&quot;,&quot;shutter_speed&quot;:&quot;0.05&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="EC6" data-image-description="" data-medium-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC6.jpg?fit=300%2C200" data-large-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC6.jpg?fit=1024%2C683" class="alignnone size-full wp-image-5219" src="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC6.jpg?resize=1170%2C780" alt="" srcset="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC6.jpg?w=5472 5472w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC6.jpg?resize=300%2C200 300w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC6.jpg?resize=768%2C512 768w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC6.jpg?resize=1024%2C683 1024w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC6.jpg?w=2340 2340w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC6.jpg?w=3510 3510w" sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /></p> </div> </div> <div class="cols-wrapper cols-2"> <div class="col"> After removing phone internals</div> <div class="col nomargin"> <p>Unscrewing phone base from plastic molded top</p> </div> </div> <p><br class="clear" />Next was preparing the Raspberry Pi 3, which acts as the brain behind this embedded device (there is a software component to this, and if that interests you, please visit Part 2, where we discuss setting up Raspbian Linux, interfacing with hardware from high-level software, 
and configuring the high-level display and data code, etc.), but for the sake of this hardware write-up we will assume the Raspberry Pi software is already configured and focus only on the wiring and mounting of components. Chief among these mounting considerations is keeping the Raspberry Pi off the metal base of the phone, as the exposed solder on the bottom of the Pi would otherwise short out the device.</p> <p><img data-attachment-id="5225" data-permalink="http://redpillanalytics.com/kscope17-analytics-hardware/ec7/" data-orig-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC7.jpg?fit=5472%2C3648" data-orig-size="5472,3648" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;2.2&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496685938&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;250&quot;,&quot;shutter_speed&quot;:&quot;0.0333333333333&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="EC7" data-image-description="" data-medium-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC7.jpg?fit=300%2C200" data-large-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC7.jpg?fit=1024%2C683" class="alignnone size-full wp-image-5225" src="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC7.jpg?resize=1170%2C780" alt="" srcset="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC7.jpg?w=5472 5472w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC7.jpg?resize=300%2C200 300w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC7.jpg?resize=768%2C512 768w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC7.jpg?resize=1024%2C683 1024w, 
https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC7.jpg?w=2340 2340w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC7.jpg?w=3510 3510w" sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /><br /> 3D printing Pi enclosure</p> <p>The solution was simply to 3D print a plastic case for the Pi so that it could be mounted firmly inside the phone while remaining fully accessible for wiring. See the pictures below of the print in progress and how it looks once finished and mounted inside the phone.</p> <p>&nbsp;</p> <div class="cols-wrapper cols-3"> <div class="col"> <p><img data-attachment-id="5226" data-permalink="http://redpillanalytics.com/kscope17-analytics-hardware/ec8/" data-orig-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC8.jpg?fit=5472%2C3648" data-orig-size="5472,3648" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496751829&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;2000&quot;,&quot;shutter_speed&quot;:&quot;0.05&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="EC8" data-image-description="" data-medium-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC8.jpg?fit=300%2C200" data-large-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC8.jpg?fit=1024%2C683" class="alignnone size-full wp-image-5226" src="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC8.jpg?resize=1170%2C780" alt="" srcset="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC8.jpg?w=5472 5472w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC8.jpg?resize=300%2C200 300w, 
https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC8.jpg?resize=768%2C512 768w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC8.jpg?resize=1024%2C683 1024w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC8.jpg?w=2340 2340w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC8.jpg?w=3510 3510w" sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /></p> </div> <div class="col"> <p><img data-attachment-id="5227" data-permalink="http://redpillanalytics.com/kscope17-analytics-hardware/ec9/" data-orig-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC9.jpg?fit=5472%2C3648" data-orig-size="5472,3648" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496751950&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;800&quot;,&quot;shutter_speed&quot;:&quot;0.05&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="EC9" data-image-description="" data-medium-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC9.jpg?fit=300%2C200" data-large-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC9.jpg?fit=1024%2C683" class="alignnone size-full wp-image-5227" src="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC9.jpg?resize=1170%2C780" alt="" srcset="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC9.jpg?w=5472 5472w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC9.jpg?resize=300%2C200 300w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC9.jpg?resize=768%2C512 768w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC9.jpg?resize=1024%2C683 1024w, 
https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC9.jpg?w=2340 2340w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC9.jpg?w=3510 3510w" sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /></p> </div> <div class="col nomargin"> <p><img data-attachment-id="5228" data-permalink="http://redpillanalytics.com/kscope17-analytics-hardware/ec10/" data-orig-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC10.jpg?fit=5472%2C3648" data-orig-size="5472,3648" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496752029&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;800&quot;,&quot;shutter_speed&quot;:&quot;0.05&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="EC10" data-image-description="" data-medium-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC10.jpg?fit=300%2C200" data-large-file="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC10.jpg?fit=1024%2C683" class="alignnone size-full wp-image-5228" src="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC10.jpg?resize=1170%2C780" alt="" srcset="https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC10.jpg?w=5472 5472w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC10.jpg?resize=300%2C200 300w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC10.jpg?resize=768%2C512 768w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC10.jpg?resize=1024%2C683 1024w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC10.jpg?w=2340 2340w, https://i0.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC10.jpg?w=3510 3510w" 
sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /></p> </div> </div> <div class="cols-wrapper cols-3"> <div class="col"> <p> Enclosure from side</p> </div> <div class="col"> <p>Pi sitting in enclosure</p> </div> <div class="col nomargin"> <p>Enclosure placed inside phone base</p> </div> </div> <p>With the Pi securely mounted in its plastic enclosure, the next step was to begin wiring the number pad, headset audio jack, and hook switch to DuPont wires so they could easily be connected to the Pi’s 40-pin GPIO header. I stripped the wires, twisted them together, soldered them as needed, and applied liquid electrical tape to each joint to prevent electrical shorts between wires. Lastly, I plugged the female ends of the DuPont cables into the respective male header pins on the Pi. See below for pictures of the soldering process.</p> <div class="cols-wrapper cols-3"> <div class="col"> <p><img data-attachment-id="5230" data-permalink="http://redpillanalytics.com/kscope17-analytics-hardware/ec11/" data-orig-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC11.jpg?fit=5472%2C3648" data-orig-size="5472,3648" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496843275&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;800&quot;,&quot;shutter_speed&quot;:&quot;0.0333333333333&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="EC11" data-image-description="" data-medium-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC11.jpg?fit=300%2C200" data-large-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC11.jpg?fit=1024%2C683" class="alignnone size-full wp-image-5230" 
src="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC11.jpg?resize=1170%2C780" alt="" srcset="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC11.jpg?w=5472 5472w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC11.jpg?resize=300%2C200 300w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC11.jpg?resize=768%2C512 768w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC11.jpg?resize=1024%2C683 1024w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC11.jpg?w=2340 2340w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC11.jpg?w=3510 3510w" sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /></p> </div> <div class="col"> <p><img data-attachment-id="5231" data-permalink="http://redpillanalytics.com/kscope17-analytics-hardware/ec12/" data-orig-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC12.jpg?fit=5472%2C3648" data-orig-size="5472,3648" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496843597&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;500&quot;,&quot;shutter_speed&quot;:&quot;0.0333333333333&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="EC12" data-image-description="" data-medium-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC12.jpg?fit=300%2C200" data-large-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC12.jpg?fit=1024%2C683" class="alignnone size-full wp-image-5231" src="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC12.jpg?resize=1170%2C780" alt="" 
srcset="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC12.jpg?w=5472 5472w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC12.jpg?resize=300%2C200 300w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC12.jpg?resize=768%2C512 768w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC12.jpg?resize=1024%2C683 1024w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC12.jpg?w=2340 2340w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC12.jpg?w=3510 3510w" sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /></p> </div> <div class="col nomargin"> <p><img data-attachment-id="5232" data-permalink="http://redpillanalytics.com/kscope17-analytics-hardware/ec13/" data-orig-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC13.jpg?fit=5472%2C3648" data-orig-size="5472,3648" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496849069&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;800&quot;,&quot;shutter_speed&quot;:&quot;0.04&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="EC13" data-image-description="" data-medium-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC13.jpg?fit=300%2C200" data-large-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC13.jpg?fit=1024%2C683" class="alignnone size-full wp-image-5232" src="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC13.jpg?resize=1170%2C780" alt="" srcset="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC13.jpg?w=5472 5472w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC13.jpg?resize=300%2C200 
300w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC13.jpg?resize=768%2C512 768w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC13.jpg?resize=1024%2C683 1024w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC13.jpg?w=2340 2340w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC13.jpg?w=3510 3510w" sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /></p> </div> </div> <div class="cols-wrapper cols-3"> <div class="col"> <p>After twisting and soldering wires together</p> </div> <div class="col"> <p>After applying white liquid electrical tape</p> </div> <div class="col nomargin"> <p> DuPont wires connected to Pi</p> </div> </div> <p>With all of the wiring in place and the components mounted, the last thing to do was to attach a micro USB cable and an HDMI cable, put the black phone top back on, and plug in the handset phone line. You can see a completed picture below!</p> <p><img data-attachment-id="5235" data-permalink="http://redpillanalytics.com/kscope17-analytics-hardware/ec14/" data-orig-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC14.jpg?fit=5472%2C3648" data-orig-size="5472,3648" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;1.8&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;Canon PowerShot G7 X&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;1496849805&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;8.8&quot;,&quot;iso&quot;:&quot;640&quot;,&quot;shutter_speed&quot;:&quot;0.0333333333333&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;1&quot;}" data-image-title="EC14" data-image-description="" data-medium-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC14.jpg?fit=300%2C200" data-large-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC14.jpg?fit=1024%2C683" class="alignnone size-full 
wp-image-5235" src="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC14.jpg?resize=1170%2C780" alt="" srcset="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC14.jpg?w=5472 5472w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC14.jpg?resize=300%2C200 300w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC14.jpg?resize=768%2C512 768w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC14.jpg?resize=1024%2C683 1024w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC14.jpg?w=2340 2340w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/EC14.jpg?w=3510 3510w" sizes="(max-width: 1170px) 100vw, 1170px" data-recalc-dims="1" /></p> Emily Carlsen http://redpillanalytics.com/?p=5184 Fri Jun 23 2017 13:01:36 GMT-0400 (EDT) ODTUG Announces the 2nd Annual GeekAThon http://www.odtug.com/p/bl/et/blogaid=736&source=1 SAVE THE DATE: ODTUG announces its 2nd annual GeekAThon! Get your *GEEK ON* and dazzle the community with your brilliant skills! ODTUG http://www.odtug.com/p/bl/et/blogaid=736&source=1 Fri Jun 23 2017 11:08:04 GMT-0400 (EDT) OBIEE 12c Catalog Validation: Command Line http://www.rittmanmead.com/blog/2017/06/obiee-catalog-validation-running-it-from-command-line/ <img src="http://www.rittmanmead.com/blog/content/images/2017/06/1q1dx1-2.jpg" alt="OBIEE 12c Catalog Validation: Command Line"><p>I wrote a <a href="https://www.rittmanmead.com/blog/2016/11/performing-a-catalog-validation/">blog post</a> a while ago describing the catalog validation: an automated process performing a consistency check of the catalog and reporting or deleting the inconsistent artifacts. 
<br> In the post I stated that catalog validation should be implemented regularly as part of the cleanup routines and that it provides precious additional information during the pre- and post-upgrade phases.</p> <p>However, some time later I noted Oracle's support <a href="https://support.oracle.com/epmos/faces/DocumentDisplay?&amp;id=2199938.1">Doc ID 2199938.1</a> stating that the startup procedure I detailed in the previous blog post is not supported in any OBI release since 12.2.1.1.0. You can imagine my reaction...</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/05/1q1dx1.jpg" alt="OBIEE 12c Catalog Validation: Command Line"></p> <p>The question then became: How do we run the catalog validation since the known procedure is unsupported? The answer is in Catalog Manager and the related command line call <code>runcat.sh</code> which, in server installations (like the <a href="http://www.oracle.com/technetwork/middleware/bi-foundation/obiee-samples-167534.html">SampleApp v607p</a>), can be found under <code>$DOMAIN_HOME/bitools/bin</code>.</p> <h1 id="howdoesitwork">How Does it Work?</h1> <p>As with most command line tools, when you don't have a clue how it works, the best approach is to run it with the <code>-help</code> option, which provides the list of parameters to pass.</p> <pre><code>Catalog Manager understands commands in the following areas: Development To Production createFolder Creates folder in the catalog delete Deletes the given path from the catalog maintenanceMode Puts the catalog into or out of Maintenance Mode (aka ReadOnly) ... Multi-Tenancy provisionTenant Provisions tenants into a web catalog ... Patch Management tag Tags all XML documents in a catalog with a unique id and common version string diff Compares two catalogs inject Injects a single item to a diff file ... 
Subject Area Management clearQueryCache Clears the query cache </code></pre> <p>Unfortunately none of the options in the list seems to be relevant for catalog validation, but with a close look at the recently updated <a href="https://support.oracle.com/epmos/faces/DocumentDisplay?&amp;id=2199938.1">Doc ID 2199938.1</a> I could find the parameter to pass: <code>validate</code>. <br> The full command then looks like </p> <pre><code>./runcat.sh -cmd validate </code></pre> <p>In my <a href="https://www.rittmanmead.com/blog/2016/11/performing-a-catalog-validation/">previous blog</a> I mentioned different types of validation. What type of validation is the default command going to implement? How can I change the behaviour? Again the <code>-help</code> option provides the list of instructions.</p> <pre><code># Command : -cmd validate -help validate Validates the catalog Description Validates the catalog For more information, please see the Oracle Business Intelligence Suite Enterprise Edition's Presentation Services Administration Guide. Syntax runcat.cmd/runcat.sh -cmd validate [ -items (None | Report | Clean) [ -links (None | Report | Clean) ] [-folder &lt;path{:path}&gt;] [-folderFromFile &lt;path of inclusion list file&gt;] ] [ -accounts (None | Report | Clean) [ -homes (None | Report | Clean) ] ] -offline &lt;path of catalog&gt; Basic Arguments None Optional Arguments -items (None | Report | Clean) Default is 'Report' -links (None | Report | Clean) Default is 'Clean'. Also, '-items' cannot be 'None'. -accounts (None | Report | Clean) Default is 'Clean' -homes (None | Report | Clean) Default is 'Report'. Also, '-accounts' cannot be 'None'. 
-folder &lt;path{:path}&gt; Which folders in the catalog to validate -folderFromFile &lt;path of inclusion list file&gt; File containing folders in the catalog to validate Common Arguments -offline &lt;path of catalog&gt; -folderFromFile &lt;folder from file&gt; ----- Sample Folder From File ------ /shared/groups/misc /shared/groups/_filters ------------------------------------ Example runcat.cmd/runcat.sh -cmd validate -offline c:\oraclebi\data\web\catalog\paint </code></pre> <p>A few bits to notice:</p> <ul> <li><strong>-offline</strong>: the catalog validation needs to happen offline, either with services down or on a copy of the live catalog. Running catalog validation on an online catalog is dangerous, especially with <em>"Clean"</em> options, since they could delete content in use.</li> <li><strong>-folder</strong>: the catalog validation can be run only for a subset of the catalog</li> <li><strong>None | Report | Clean</strong>: each validation can be skipped (None), logged (Report) or solved via removal of the inconsistent object (Clean)</li> <li><strong>Also, '-accounts' cannot be 'None'.</strong>: some validations are a prerequisite for others to happen</li> <li><strong>Default is 'Clean'</strong>: some validations have <em>"Clean"</em> as the default value, meaning that they will solve the issue by removing the inconsistent object; this may be inappropriate in some cases.</li> </ul> <p>As written before, the initial catalog validation should be done with all options set to <em>Report</em> since this will give a log file of all inconsistencies without deleting pieces of the catalog that could still be valuable. 
In order to do so, the command to execute is:</p> <pre><code>./runcat.sh -cmd validate -items Report -links Report -accounts Report -homes Report -offline &lt;path_to_catalog&gt; &gt; cat_validation.log </code></pre> <p><code>runcat.sh</code> output is displayed directly in the console; I'm redirecting it to a file called <code>cat_validation.log</code> for further analysis.</p> <p>If, after the initial run with all options set to <em>Report</em>, you want the catalog validation utility to "fix" the inconsistent objects, just change the desired options to <em>Clean</em>. Please make sure to take a backup of the catalog beforehand, since the automatic fix is done by removing the related objects. Moreover, ensure that catalog validation is working on an <strong>offline catalog</strong>. The command itself can work on top of an online catalog, but it is never a good idea to check a catalog that could potentially be changed while the tool is running.</p> <h1 id="theoutput">The output</h1> <p>Let's see a few examples of how Catalog Validation spots inconsistent objects. For the purpose of this test I'll work with <a href="http://www.oracle.com/technetwork/middleware/bi-foundation/obiee-samples-167534.html">Oracle's Sampleapp</a>.</p> <h2 id="abandonedandinaccessiblehomes">Abandoned and inaccessible homes</h2> <p>Running the validation against the Sampleapp catalog provides some "interesting" results: some homes are declared "abandoned". This could be due to the related user not existing anymore in the WebLogic console, but that's not the case. </p> <pre><code>E10 saw.security.validate.homes Abandoned home /users/weblogic </code></pre> <p>Looking deeper in the logs we can see that the same user folders are flagged as</p> <pre><code>User facing object '/users/weblogic' has no user permissions and is inaccessible </code></pre> <p>Logging in with the user <code>weblogic</code> doesn't allow me to check the "My Folders" in the catalog. 
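As an aside on the Report run above: since all findings end up in the redirected <code>cat_validation.log</code>, they can be triaged by error code before deciding what to set to Clean. The snippet below is a hypothetical helper, not part of <code>runcat.sh</code>; the line format (a short code such as E10 or N1, then the check name, then the detail) is assumed from the sample lines shown in this post.

```python
import re
from collections import Counter

# Assumed line shape, based on the samples in this post:
#   "E10 saw.security.validate.homes Abandoned home /users/weblogic"
LINE_RE = re.compile(r"^([A-Z]\d+)\s+(\S+)\s+(.*)$")

def summarize(log_text):
    """Count validation findings per (code, check) pair."""
    counts = Counter()
    for line in log_text.splitlines():
        m = LINE_RE.match(line.strip())
        if m:
            code, check, _detail = m.groups()
            counts[(code, check)] += 1
    return counts

# Two sample lines taken from the post's output.
sample = """\
E10 saw.security.validate.homes Abandoned home /users/weblogic
N1 saw.catalog.impl.scour.validateDeadLink Referenced path /shared/RM Demo/TargetAnalysis in file /shared/RM Demo/SourceAnalysis is inaccessible
"""

summary = summarize(sample)
```

A quick count per check makes it easier to see whether a Clean run would touch a handful of objects or hundreds of them.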
When switching to <em>"Admin View"</em> and trying to open "My Folder" I get the following error:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/My-Folder-Error.png" alt="OBIEE 12c Catalog Validation: Command Line"></p> <p>As written in the logs, it looks like the user folder has permission problems. How can we solve this? One option is to use the <code>runcat.sh</code> command again with the <code>forgetAccounts</code> option to remove the inconsistent homes. However, this solution deletes all the content related to the user that was stored under the "My Folders". </p> <p>In order to keep the content we need to overwrite the folder's permission with an administrator account. Unfortunately, when right-clicking on the folder, the "Permission" option is not available.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Properties-1.png" alt="OBIEE 12c Catalog Validation: Command Line"></p> <p>As a workaround I found that clicking on <code>Properties</code> and then on <code>Set Ownership of this item and all subitems</code> allows you to grant full access to the administrator, who is then able to restore the relevant access privileges to the proper user.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/change-ownership.png" alt="OBIEE 12c Catalog Validation: Command Line"></p> <p>Once the workaround is implemented the user is able to check his "My Folder" content; however, the errors are still present in catalog validation. The solution is to store the relevant artifacts in another part of the catalog, run <code>runcat.sh</code> with the <code>forgetAccounts</code> option and then reimport the objects if needed.</p> <h2 id="inconsistentobjects">Inconsistent Objects</h2> <p>The two main reasons for inconsistent objects are:</p> <ul> <li><strong>Invalid XML</strong>: The object (analysis or dashboard) XML code is not valid. 
This can be caused by errors during the write to disk or problems during migrations.</li> <li><strong>Broken Links</strong>: analyses contained in a dashboard or linked from other analyses have been renamed or deleted.</li> </ul> <p>Let's see how catalog validation shows the errors.</p> <h3 id="invalidxml">Invalid XML</h3> <p>To test this case I created a simple analysis with two columns and then went to the <em>Advanced</em> tab and deliberately removed an <code>&gt;</code> to make the XML invalid.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screwed-XML-1.png" alt="OBIEE 12c Catalog Validation: Command Line"></p> <p>When trying to apply the change I got the following error, which prevented me from saving.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Error-XML.png" alt="OBIEE 12c Catalog Validation: Command Line"></p> <p>Since I really wanted to ruin my analysis I went directly to the file system under <code>$BI_HOME/bidata/service_instances/ssi/metadata/content/catalog/root/shared/$REQUEST_PATH</code> and changed the XML directly there.</p> <p>After that I ran the catalog validation with only the flag <code>items</code> equal to <code>Report</code> and the rest set to <code>None</code> since I'm looking only at invalid XMLs. <br> The result, as expected, is:</p> <pre><code>Message: Unterminated start tag, 'saw:column', Entity publicId: /app/oracle/biee/user_projects/domains/bi/bidata/service_instances/ssi/metadata/content/catalog/root/shared/rm+demo/notworkinanalysis, Entity systemId: , Line number: 9, Column number: 13 </code></pre> <p>Which tells me that my analysis <code>notworkinganalysis</code> is invalid with an unterminated start tag, exactly the error I was expecting. Now I have two choices: either fix the analysis XML manually or rerun the catalog validation with the option <code>Clean</code>, which will delete the analysis since it's invalid. 
As said before, there is no automated fix.</p> <p>I wanted to do a further example on this: instead of removing the <code>&gt;</code>, I removed a quotation mark <code>"</code> to make the analysis invalid.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screwed-Column.png" alt="OBIEE 12c Catalog Validation: Command Line"></p> <p>After clicking <em>Apply</em>, OBIEE already tells me that there is something wrong in the analysis. But since it allows me to save, and since I was feeling masochistic, I saved the analysis.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Error-before-Saving.png" alt="OBIEE 12c Catalog Validation: Command Line"></p> <p>But... when running the catalog validation as before I end up seeing <strong>0 errors</strong> related to my <code>notworkinganalysis</code>. </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/1rb66j.jpg" alt="OBIEE 12c Catalog Validation: Command Line"></p> <p>The answer to Jackie Chan's question is that I got 0 errors since in this second case the <strong>XML is still valid</strong>. Removing a <code>"</code> doesn't make the XML syntax invalid! In order to find and solve that error we would need to use Oracle's <a href="http://www.oracle.com/technetwork/middleware/bi/downloads/bi-bvt-download-3587672.html">Baseline Validation Tool</a>. 
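The difference between the two corruptions comes down to XML well-formedness, which is all the validator checks at this level. A quick illustration in Python (not an OBIEE tool, just a sketch of the rule; the column content below is made up): an unterminated start tag fails to parse, while a stray quotation mark inside text content is perfectly legal XML.

```python
import xml.etree.ElementTree as ET

# An unterminated start tag, like the deliberately broken <saw:column>,
# is a well-formedness error and fails to parse.
broken = "<report><column</report>"

# An unbalanced quotation mark inside a formula is just text content to
# the XML parser, so the document stays well-formed. (Made-up formula.)
still_valid = '<report><column>CASE WHEN "Products.Type = 1 THEN 1 END</column></report>'

def is_well_formed(xml_text):
    """Return True if the text parses as XML, False otherwise."""
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False
```

So the validator can only catch the first kind of damage; semantic errors inside a syntactically valid document need a different tool, which is where the Baseline Validation Tool comes in.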
</p> <h3 id="brokenlinks">Broken Links</h3> <p>To test the broken links case I created the following scenario:</p> <ul> <li>Analysis <code>SourceAnalysis</code> which has a navigation action to <code>TargetAnalysis</code></li> </ul> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Interaction.png" alt="OBIEE 12c Catalog Validation: Command Line"></p> <ul> <li>Dashboard <code>TestDashboard</code> which contains the <code>TargetAnalysis</code> object.</li> </ul> <p>In order to break things I then deleted the <code>TargetAnalysis</code>.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Delete.png" alt="OBIEE 12c Catalog Validation: Command Line"></p> <p>I then ran catalog validation with the option <code>links</code> set to <code>Report</code>. As expected I get the line</p> <pre><code>N1 saw.catalog.impl.scour.validateDeadLink Referenced path /shared/RM Demo/TargetAnalysis in file /shared/RM Demo/_portal/TestDashboard/page 1 is inaccessible. </code></pre> <p>But I don't get anything on the <code>SourceAnalysis</code> object, for which navigation is failing.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Error.png" alt="OBIEE 12c Catalog Validation: Command Line"></p> <p>But if instead of an action link I use <code>TargetAnalysis</code> to filter the results of <code>SourceAnalysis</code></p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Filter.png" alt="OBIEE 12c Catalog Validation: Command Line"></p> <p>And then delete <code>TargetAnalysis</code>, I get the expected error:</p> <pre><code>N1 saw.catalog.impl.scour.validateDeadLink Referenced path /shared/RM Demo/TargetAnalysis in file /shared/RM Demo/SourceAnalysis is inaccessible </code></pre> <p>Summarizing: the <strong>broken link</strong> validation reports missing objects that are included in the main definition of other objects (as filters or as parts of dashboards) but doesn't seem to report anything if the missing object is only linked via an 
action.</p> <h1 id="conclusion">Conclusion</h1> <p>My experiments show that catalog validation finds some errors, like invalid homes, invalid XML files and broken links, which users would otherwise hit at run-time, and that won't make them happy. But there are still some errors it doesn't log, like analyses with wrong column syntax; luckily, in most cases other tools like the Baseline Validation Tool can spot them easily. So use all the tools you have, use them as frequently as possible, and if you want more details about how catalog validation works and how it can be included in the automatic checks for code promotions, don't hesitate to <a href="mailto:info+ftcv2@rittmanmead.com">contact us</a>!</p> Francesco Tisiot bc90cca5-52f9-4e0f-b131-a8f858f43e9e Fri Jun 23 2017 09:49:32 GMT-0400 (EDT) Installing Scala and Apache Spark on a Mac http://www.oralytics.com/2017/06/installing-scala-and-apache-spark-on-mac.html <p>The following outlines the steps I've followed to get Scala and Apache Spark installed on my Mac. This allows me to play with Apache Spark on my laptop (single node) before deploying my code to a multi-node cluster.</p> <p><span style='text-decoration:underline;'><strong>1. Install Homebrew</strong></span></p><a href="https://brew.sh/">Homebrew</a> seems to be the standard for installing anything on a Mac. 
To install <a href="https://brew.sh/">Homebrew</a> run <pre><br />/usr/bin/ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"<br /></pre> <img src="https://lh3.googleusercontent.com/-hdiET4FNqg0/WUwCItXJKSI/AAAAAAAAMKg/c0F5VNIMZz4HD1_lFwjRCtqeYrskClNSACHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="485" height="417" /> <p>When prompted, enter your system/OS password to allow the install to proceed.</p> <img src="https://lh3.googleusercontent.com/-TW1YftbqUUM/WUwCRirUhkI/AAAAAAAAMKk/QEBJjrhswe8DSL-10pBXPrBO2t4Z6mEoACHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="485" height="417" /> <img src="https://lh3.googleusercontent.com/-kDQ5zd_DdkI/WUwCepN678I/AAAAAAAAMKo/mRusshRNg3UImcAl_WPHEK2r-IWBy313ACHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="430" height="260" /> <p><span style='text-decoration:underline;'><strong>2. Install xcode-select (if needed)</strong></span></p><p>You may have xcode-select already installed. This tool allows you to install the languages using the command line.</p> <pre><br />xcode-select --install<br /></pre> <p>If it is already installed, nothing will happen and you will get the following message.</p> <pre><br />xcode-select: error: command line tools are already installed, use "Software Update" to install updates<br /></pre> <p><span style='text-decoration:underline;'><strong>3. Install Scala</strong></span></p><p>[If you haven't installed Java, you need to do that too.]</p><p>Use Homebrew to install Scala.</p> <pre><br />brew install scala<br /></pre> <img src="https://lh3.googleusercontent.com/-HME9BbyStrI/WUwFicSMb5I/AAAAAAAAMK0/-cJg61U7VdgjH3WAkTX6LZK0FzfaG-UBgCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="485" height="135" /> <p><span style='text-decoration:underline;'><strong>4. 
Install Apache Spark</strong></span></p><p>Now to install Apache Spark.</p> <pre><br />brew install apache-spark<br /></pre> <img src="https://lh3.googleusercontent.com/-h3Fl2TkKwaM/WUwGSG8vZaI/AAAAAAAAMK8/918sDICKEuk1zJSJY-srL-5-60aEwWjNACHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="457" height="63" /> <p><span style='text-decoration:underline;'><strong>5. Start Spark</strong></span></p><p>Now you can start the Apache Spark shell.</p> <pre><br />spark-shell<br /></pre> <img src="https://lh3.googleusercontent.com/-7pVDKlmECOE/WUwHLPSo8rI/AAAAAAAAMLE/lmTMmxVwqy02hHSkT1-ZUMbPGBTbKQbFQCHMYCw/NewImage.png?imgmax=800" alt="NewImage" title="NewImage.png" border="0" width="484" height="225" /> <p><span style='text-decoration:underline;'><strong>6. Hello-World and Reading a file</strong></span></p><p>The traditional Hello-World example.</p><pre><br />scala> val helloWorld = "Hello-World"<br />helloWorld: String = Hello-World<br /></pre><p>or</p><pre><br />scala> println("Hello World")<br />Hello World<br /></pre><p>What is my current working directory?</p><br /><pre><br />scala> val whereami = System.getProperty("user.dir")<br />whereami: String = /Users/brendan.tierney<br /></pre> <p>Read and process a file.</p> <pre><br />scala> val lines = sc.textFile("docker_ora_db.txt")<br />lines: org.apache.spark.rdd.RDD[String] = docker_ora_db.txt MapPartitionsRDD[3] at textFile at <console>:24<br /><br />scala> lines.count()<br />res6: Long = 36<br /><br />scala> lines.foreach(println)<br />####################################################################<br />## Specify the basic DB parameters<br />## Copyright(c) Oracle Corporation 1998,2016. 
All rights reserved.##<br />## ##<br />##------------------------------------------------------------------<br />## Docker OL7 db12c dat file ##<br /><br />## ##<br />## db sid (name)<br />####################################################################<br />## default : ORCL<br /><br />## cannot be longer than 8 characters<br />##------------------------------------------------------------------<br /><br />...<br /></pre> <br><p>There will be a lot more on how to use Spark and how to use Spark with Oracle (all their big data stuff) over the coming months.</p><br><p>[<em>I've been busy for the past few months working on this stuff, EU GDPR issues relating to machine learning, and other things. I'll be sharing some of what I've been working on and learning in blog posts over the coming weeks</em>]</p> Brendan Tierney tag:blogger.com,1999:blog-4669933501315263808.post-1995446547230968416 Thu Jun 22 2017 15:17:00 GMT-0400 (EDT) Getting Smarter in Renting with Tableau 10 http://www.rittmanmead.com/blog/2017/06/getting-smarter-in-renting-with-tableau-10/ <h2 id="preface">Preface</h2> <p>Not long ago a friend of mine spent a significant amount of time trying to find a flat to rent. According to him, it wasn't an easy task. It took him a decent amount of time and effort to find something that was big enough (but not too big), not too far from his workplace, had the required features and was affordable at the same time. As a specialist in data analysis, I prefer to think about this task as a data discovery one (yes, when you have a hammer everything looks like a nail). And I decided to see if a data analysis tool can help me understand the rental market better. I'm sure you've already read the name of this post so I can't pretend to keep up the intrigue. This tool is Tableau 10.3.</p> <h2 id="thedata">The Data</h2> <p>The friend I was talking about before was looking for a flat in Moscow, but I think that this market is completely unknown to most of the readers. 
And I'd also have to spend half of the time translating everything into English, so for this exercise I <a href="https://scrapy.org/">took</a> <a href="https://www.google.com/maps/place/Brighton,+UK">Brighton and Hove</a> data from <a href="http://www.rightmove.co.uk/property-to-rent/find.html?locationIdentifier=REGION^61480">http://rightmove.co.uk</a> and got a nice <a href="http://jsonlines.org/">JSON Lines</a> file. JSON Lines files are basically the same JSON as we all know, but every file has multiple JSONs delimited by newlines. </p> <pre><code>{json line #1} {json line #2} ... {json line #n} </code></pre> <p>That could be a real problem but luckily Tableau introduced <a href="https://www.tableau.com/new-features/10.1#tab-data-1">JSON support in Tableau 10.1</a> and that means I don't have to transform my data to a set of flat tables. Thanks to Tableau developers we may simply open JSON Lines files without any transformations.</p> <p>A typical property description looks like this:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/10-2.png" alt="10"></p> <p>It has a few major blocks:</p> <ul> <li><strong>Property name</strong> - 2 bedroom apartment to rent</li> <li><strong>Monthly price</strong> - £1,250</li> <li><strong>Description</strong> tab: <ul><li><strong>Letting information</strong> - this part is more or less standard and has only a small number of possible values. This part has <code>Property name</code>: <code>Property value</code> structure ('Date available':'Now').</li> <li><strong>Key features</strong> - this part is an unformalized set of features. Every property may have its own unique features. 
And it is not a key-value list like <strong>Letting information</strong>, but a simple list of features.</li> <li><strong>Full description</strong> - simply a block of unstructured text.</li></ul></li> <li><strong>Nearest stations</strong> - shows the three nearest train stations (there could be underground stations too if they had them in Brighton).</li> <li><strong>School checker</strong> - this shows the 10 closest primary and 10 closest secondary schools. For this, I found a kind of API which brought me a detailed description of every school.</li> </ul> <p>And finally, the JSON for one property has the following look. In reality, it is one line, but to make it easier to read I formatted it into a human-readable format. I also deleted most of the schools' info, as it is huge and not that important.</p> <p><details> <br> <summary>Property JSON</summary> </details></p> <pre><code>{ "furnish":"Unfurnished", "key_features":[ "LARGE BRIGHT SPACIOUS LOUNGE WITH PATIO DOORS", "FULLY FITTED KITCHEN", "TWO DOUBLE BEDROOMS WITH WARDROBES", "A FURTHER SINGLE BEDROOM/OFFICE/STUDY", "A GOOD SIZED SHOWER ROOM ", "SINGLE GARAGE AND ON STREET PARKING", "EASY ACCESS TO THE CITY CENTRE OF CHICHESTER AND COMMUTER ROUTES. ", "TO ARRANGE A VIEWING PLEASE CONTACT US ON 01243 839149" ], "property_price_week":"£254 pw", "nearest_stations":[ { "station_name":"Fishbourne", "station_dist":"(0.4 mi)" }, { "station_name":"Chichester", "station_dist":"(1.2 mi)" }, { "station_name":"Bosham", "station_dist":"(1.7 mi)" } ], "letting_type":"Long term", "secondary_schools":{ "schools":[ { "distance":"0.6 miles", "ukCountryCode":"ENG", "name":"Bishop Luffa School, Chichester", ... }] } "url":"http://www.rightmove.co.uk/property-to-rent/property-66941567.html", "date_available":"Now", "date_reduced":"", "agent":"On The Move, South", "full_description":"&lt;p itemprop=\"description\"&gt;We are delighted to bring to market, this fabulous semi detached bungalow ... 
&lt;/p&gt;", "primary_schools":{ "schools":[ { "distance":"0.3 miles", "ukCountryCode":"ENG", "name":"Fishbourne CofE Primary School", }] } }, "property_address":[ "Mill Close, Chichester, West Sussex, PO19"], "property_name":"3 bedroom bungalow to rent", "date_added":"08 June 2017 (18 hours ago)", "property_price_month":"£1,100 pcm", "let_agreed":null, "unknownown_values":"", "deposit":"£1384" } </code></pre> <p></p> <p>The full version is here: <a href="https://gist.github.com/andrew-fomin/73605314c730c6f4f7157ca2a2fe6be9">6391 lines, I warned you</a>. My dataset is relatively small and has <strong>1114</strong> of such records <strong>117 MB</strong> in total.</p> <p>Just a few things I'd like to highlight. <strong>Letting information</strong> has only a small number of fixed unique options. I managed to parse them to fields like <code>furnish</code>, <code>letting_type</code>, etc. <strong>Key Features</strong> list became just an array. We have thousands of various features here and I can't put them to separate fields. <strong>Nearest stations</strong> list became an array of name and value pairs. My first version of the scrapper put them to a key-value list. Like this:</p> <pre><code>"nearest_stations":[ "Fishbourne": "(0.4 mi)", "Chichester": "(1.2 mi)", "Bosham": "(1.7 mi)" ] </code></pre> <p>but this didn't work as intended. I got around one hundred of measures with names <em>Fishbourne</em>, <em>Chichester</em>, <em>Bosham</em>, etc. Not what I need. But that could work well if I had only a small number of important POIs (airports for example) and wanted to know distances to this points. 
So I changed it to this and it worked well:</p> <pre><code>"nearest_stations":[ { "station_name":"Fishbourne", "station_dist":"(0.4 mi)" }, { "station_name":"Chichester", "station_dist":"(1.2 mi)" }, { "station_name":"Bosham", "station_dist":"(1.7 mi)" } ] </code></pre> <h2 id="connecttothedata">Connect to the Data</h2> <p>When I started this study my knowledge of the UK property rent market was close to this:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/20-1.jpg" alt="20"></p> <p>And it's possible or even likely that some of <em>my conclusions</em> may be obvious to anyone who is deep in the topic. In this blog, I show how a complete newbie (me) can use Tableau and become less ignorant.</p> <p>So my very first task was to understand what kind of objects are available for rent, what their prices are, and so on. That is the typical task for any new subject area.</p> <p>As I said before, Tableau 10 can work with JSON files natively, but the question was whether it could work with such a complex JSON as I had. I started a new project and opened my JSON file.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/30-1.png" alt="30"></p> <p>I expected that I would have to somehow simplify it. But in reality, after a few seconds of waiting, Tableau displayed the full structure of my JSON and all I had to do was select the branches I needed.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/40-1.gif" alt="40"></p> <p>After a few more seconds I got a normal Tableau data source.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/50-1.png" alt="50"></p> <p>And this is how it looked in analysis mode:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/55.png" alt="55"></p> <h2 id="firstlookatthedata">First Look at the Data</h2> <p>OK, let's get started. The first question is obvious: "What types of property are available for rent?". 
Well, it seems that <code>name</code> ('2 bedroom apartment to rent') is what I need. I created a table report for this field.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/60-1.png" alt="60"></p> <p>Well, it gives me the first impression of what objects are offered and what my next step should be. First of all, the names end with "to rent". This just makes strings longer without adding any value. The word "bedroom" also doesn't look important. Ideally, I'd like to parse these strings into fields one of which is <code># of bedrooms</code> and the second one is <code>Property type</code>. The most obvious action is to try the <a href="https://onlinehelp.tableau.com/current/pro/desktop/en-us/split.html"><code>Split</code></a> function.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/80-1.png" alt="80"></p> <p>Well, it partially worked. This function is smart enough and removed the 'to rent' part. But except for this, it gave me nothing. On other datasets (other cities) it gave me much better results, but it still wasn't able to read my mind and do what I wanted:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/85.png" alt="85"></p> <p>But I spent only 15 seconds on this and lost nothing, and if it had worked I'd have saved a lot of time. Anyway, I'm too old to believe in magic and this almost didn't hurt my feelings.</p> <blockquote> <p>Some people, when confronted with a problem, think “I know, I'll use regular expressions.” Now they have two problems.</p> </blockquote> <p>Yes, this string literally asks for some regular expression wizardry.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/90-1.png" alt="90"></p> <p>I can easily use <a href="https://onlinehelp.tableau.com/current/pro/desktop/en-us/functions_functions_additional.html"><code>REGEXP_EXTRACT_NTH</code></a> and get what I want. Group 1 is the number of bedrooms and Group 3 is the property type. 
Groups 2 and 4 are just constant words.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/100-1.png" alt="100"></p> <p><details><summary>Explanation for my regular expression</summary>I can describe most of the names in the following way: "<code>digit</code> <strong>bedroom</strong> <code>property type</code> <strong>to rent</strong>" and the rest are "<code>property type</code> <strong>to rent</strong>". So <code>digit</code> and <strong>bedroom</strong> are optional and <code>property type</code> <strong>to rent</strong> are mandatory. The expression is easy and obvious: <code>([0-9]*)( bedroom )*(.*)( to rent)</code> <br> </details></p> <p>Regular expressions are one of my favourite hammers and helped me a lot in this analysis. And after all manipulations, I got a much better view of the data (I skipped some obvious steps like creating a crosstab or a count distinct measure to save space for anything more interesting).</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/110-1.png" alt="110"></p> <p>And while this result looks pretty simple it gives me the first insight I can't get by simply browsing the site. <em>The most offered are 1 and 2 bedroom properties, especially flats and apartments. And if a family needs something bigger with 4 or 5 bedrooms, well I wish them good luck, not many offers to choose from.</em> Also, if we talk about living property only, we should <a href="https://onlinehelp.tableau.com/current/pro/desktop/en-us/filtering_datasource.html">filter out</a> things like <strong>GARAGE</strong>, <strong>PARKING</strong> or <strong>LAND</strong>.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/120-1.png" alt="120"> <img src="http://www.rittmanmead.com/blog/content/images/2017/06/130-1.png" alt="130"></p> <p>I think both charts work pretty well. 
The first one presents a nice view of how flats and apartments outnumber all other types, and the second one gives a much better understanding of how many 2 bedroom properties are offered compared to all the others.</p> <p>And while I'm not a big fan of fancy visualisations, if you need something less formal and more eye-catching, try the <code>Bubbles chart</code>. It's not something I'd recommend for analysis, but it may work well for a presentation. Every bubble represents a particular property type, colour shows the number of bedrooms, and size shows the number of properties.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/140-1.png" alt="140"></p> <h2 id="goingdeeper">Going Deeper</h2> <p>The next obvious question is the price. How much do different properties cost? Is any particular one more expensive than average, or less? What influences the price?</p> <p>As a baseline, I'd like to know the average property price. And I obviously don't want just one figure for the city-wide price; that would be meaningless. Let's start with a bar chart and see what the range of prices is.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/145-1.png" alt="145"></p> <p>Well, we have a lot of options. <em>A <strong>flat share</strong> costs less than £700, or we may choose a barn for more than £3600.</em> Again, a very simple result, but one I can't get directly from the site.</p> <p>The next obvious question is how the number of bedrooms affects the price. Does the price skyrocket with every additional bedroom, or do more bedrooms mean <em>smaller rooms</em> and a price that increases more slowly?</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/150-1.png" alt="150"></p> <p>Well, this chart gives me the answer, but it looks bad. Mostly because a lot of property types don't have enough variance in bedroom count. Studio flats have only one bedroom by definition, and the only converted barn has 7 bedrooms.
I'd like to remove types which don't have at least 3 options and see how the price changes. For this, I created a new computed field using the <code>fixed</code> keyword. It counts the number of bedroom options by property type.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/160-1.png" alt="160"></p> <p>Then I use it in the filter <code>'Bedroom # variance' at least 3</code>. Now I have a much cleaner view. And I can see that <em>typically more bedrooms mean a significantly higher price</em>, with a few exceptions. But in fact, these are not actual exceptions, just an artefact of a small dataset. I can say that <em>an increase in # of bedrooms certainly means a significant increase in price.</em> And one more insight: <em>going above 7 bedrooms may actually double the price.</em></p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/170-1.png" alt="170"></p> <p>Averages are good, but they hide important information about how prices are distributed. For example, six properties priced £1K and one at £200 give an average of £885, and looking at the average only might make you think that with £900 you may choose one of 7 options. It's very easy to build a chart to check this. Just create a new calculation called <a href="http://onlinehelp.tableau.com/current/pro/desktop/en-us/calculations_bins.html"><code>Bins</code></a> and use it in a chart.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/180-1.png" alt="180"> <img src="http://www.rittmanmead.com/blog/content/images/2017/06/190-1.png" alt="190"></p> <p>With £100 bins I got the following chart. It shows how many properties have a price falling into a particular range. For example, the £1000 bin shows the number of properties with prices £1000-£1100.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/200-1.png" alt="200"></p> <p>The distribution looks more or less as expected, but the most interesting thing here is that the £1000-£1100 interval seems to be very unpopular. Why?
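The £100 bins above amount to integer division by the bin size. A minimal sketch of the same logic in Python; the price list is made up for illustration, not the real Brighton data:

```python
from collections import Counter

BIN_SIZE = 100  # £100 bins, as in the chart

def price_bin(price, size=BIN_SIZE):
    """Lower edge of the bin a price falls into, e.g. £1050 -> £1000."""
    return (price // size) * size

# Illustrative prices only.
prices = [650, 695, 1050, 1250, 1275, 1395, 3600]
histogram = Counter(price_bin(p) for p in prices)
print(histogram[1200])  # 2 properties fall into the £1200-£1300 bin
```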
Let's add <code># of bedrooms</code> to this chart.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/210-1.png" alt="210"></p> <p>£1000 is too expensive for 1 bedroom flats and studios, but too cheap for two bedrooms. Simple. What else can we do here before moving further? Converting this chart to a running total gives a cool view.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/220-1.png" alt="220"></p> <p>What can this chart tell us? For example, if we look at the orange line (2 bedrooms), we will find that with £1200 we may choose among 277 of 624 properties. With a £1400 budget we have 486 of 624. A further £200 increase in budget won't significantly increase the number of possibilities: while the change from £1200 to £1400 almost doubled the number of options, the next £200 gives only 63 new ones. I don't have a ready-to-use insight here, but I got a way to estimate a budget for a particular type of property. <em>With budget £X I will be able to choose one of N properties.</em></p> <h2 id="whyitcostswhatitcosts">Why It Costs What It Costs</h2> <p>OK, now I know a lot of statistics about prices. My next question is about the factors affecting the price. I'd like to understand whether a particular property is worth what it costs. Of course, I won't be able to determine the exact price, but even hints may be useful.</p> <p>The first hypothesis I want to check is whether a nearby train station raises the price or doesn't matter at all. I made a chart very similar to the previous one, and it seems that the <a href="https://en.wikipedia.org/wiki/Pareto_principle">Pareto principle</a> works perfectly here: 80% of properties are closer than 20% of the maximum distance to a station.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/230-1.png" alt="230"></p> <p>But this chart doesn't say anything about the price; it just gives me an understanding of how densely train stations are placed.
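The running-total view a few paragraphs above is just a cumulative sum over the price bins. A sketch in Python; the per-bin counts are purely illustrative, chosen only so the cumulative figures line up with the 277 and 486 quoted above:

```python
from itertools import accumulate

# Illustrative counts of 2 bedroom properties per £100 price bin.
bin_counts = {1000: 40, 1100: 90, 1200: 147, 1300: 120, 1400: 89}

bins = sorted(bin_counts)
running = dict(zip(bins, accumulate(bin_counts[b] for b in bins)))

# "How many options fit a budget of £X?"
print(running[1200])  # 277
print(running[1400])  # 486
```

This is exactly what Tableau's running-total table calculation computes for each line of the chart.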
I'd say that most of the properties have a station within 10-15 minutes' walk, and therefore this should not significantly affect the price. My next chart is a scatter plot of price and distance. Every point is a property, and its coordinates on the plot are determined by its price and its distance to the nearest station. Colour shows <code># of bedrooms</code>.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/240.png" alt="240"></p> <p>I'd say that this chart shows no clear correlation between price and distance. And a more classical line chart confirms that.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/250.png" alt="250"></p> <p>The maximum price slightly decreases with distance; the minimum price, on the contrary, increases. The average price is more or less constant. I think the hypothesis is busted. <em>There is no clear correlation between the distance a tenant has to walk to a station and the price he has to pay.</em> If you want to rent something and the landlord says the price is high because of a nearby train station, tell him that there are stations all around and he should find something more interesting.</p> <p>What about furnishings? Is it cheaper to get an unfurnished property, or will a landlord be happy to meet someone who shares his taste?</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/260.png" alt="260"></p> <p>Unfurnished property is definitely cheaper. And it's interesting that in some cases partly furnished is even cheaper than completely unfurnished. But at least for furnished/unfurnished we can see a clear correlation. <em>When you see a furnished one for the price of unfurnished, it may be a good bargain.</em></p> <p>Another thing I'd like to check: can we expect a lower price for a property not available immediately? Or, on the contrary, is the best price offered for already vacant properties?</p> <p>As always, start with a general picture.
What is the average availability lead time by property type?</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/270.png" alt="270"></p> <p>For the most popular types it is about one month, and if you have a house you typically publish it two or three months in advance. And what about the price? Here is one more chart that I like in Tableau. In a nutshell, it is a normal line chart showing the average price by days before property availability, but the thickness of the lines shows the number of properties at the same time. So I can see not only the price but its reliability too. A thick line means it was formed by many properties; a thin line may be formed by just a few properties and can move up or down significantly when something changes. It would be very interesting to get historical data and see how long properties stay vacant or how long it takes before the price is reduced, but unfortunately I don't have this data.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/280.png" alt="280"></p> <p>And looking at this chart I'd say that <em>there is no statistically significant dependency between price and availability date. Renting a property available in the distant future won't save you money</em><sup><strong>*</strong></sup> (<sup><strong>*</strong></sup>=statistically).</p> <p>And the last thing I'd like to investigate is the <strong>Key features</strong>. What do landlords put as the key features of their properties? How do they affect the price?</p> <p>The list of popular Key features surprised me.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/290.png" alt="290"></p> <p>'Unfurnished' looks good to me; it is a really significant part of the deal. But 'Brighton'? For properties in Brighton? '1 Bedroom'. <em>How many bedrooms can a '1 bedroom flat to rent' have? Oh, there is a key feature saying '1 bedroom'; now I know.</em> But jokes aside, I had to do a lot of cleaning on this data before I could use it.
There are six ways to write 'Modern kitchen'. I made everything upper case, removed quotes, stripped spaces and tabs, removed noisy features like 'stylish 1 bedroom apartment', and so on. After this, I got a slightly better list with approximately 3500 features instead of 4500. Note how all the variants of writing 'GAS CENTRAL HEATING' are now combined into the single most popular feature. But there are still too many features. I'm sure there should be no more than a hundred of them. Even in this screenshot you can see both 'Unfurnished' and 'Unfurnished property' features.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/300.png" alt="300"></p> <p>When I need a visualisation for this number of points, bar charts or tables won't play well. My weapon of choice is the <strong>Scatter plot</strong>. Every point is a particular feature, the axes are its minimum and average prices, size is determined by the number of properties declaring this feature, and the colour is the maximum price. So if a feature is located high on the plot, it means that on average it will be expensive to have it. If the same feature is at the same time located close to the left side, even cheap properties may have it. For example, if you want a <strong>swimming pool</strong>, be ready to pay at least £3000, and £7000 on average. And the minimum price for a <strong>tumble dryer</strong> is £3250, but the average is £3965. The cheapest property with a dryer is more expensive than the cheapest with a pool, but on average pools are more expensive. That is how this chart works.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/310.png" alt="310"></p> <p>The problems of this chart are obvious. It is littered with unique features. Only one property has <strong>4 acres</strong> (the point in the top right corner). And actually not many swimming pools are available for rent in Brighton.
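The cleanup described above boils down to a small normalisation function. A sketch in Python; the upper-casing, quote removal and space/tab stripping follow the steps listed in the text, while the exact whitespace-squeezing rule is my own guess:

```python
import re

def normalise_feature(raw):
    """Collapse spelling variants of a key feature into one canonical form."""
    s = raw.upper()                           # make everything upper case
    s = s.replace('"', '').replace("'", '')   # remove quotes
    s = re.sub(r'\s+', ' ', s).strip()        # strip and squeeze spaces/tabs
    return s

variants = ['Gas central heating', ' GAS  CENTRAL HEATING\t', '"gas central heating"']
print({normalise_feature(v) for v in variants})  # {'GAS CENTRAL HEATING'}
```

Dropping noisy one-off features (like 'stylish 1 bedroom apartment') still needs a manual stop-list on top of this; normalisation alone only took the list from roughly 4500 features down to 3500.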
I filtered it by "# of properties > 25" and here is how prices for the most popular features are distributed.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/320.png" alt="320"></p> <p><strong>Central location</strong> will cost you at least £100, and £1195 on average, while for a <strong>Great location</strong> be ready to pay at least £445, and £1013 on average. A great location seems to be less valuable than a central one.</p> <p>And now I can see how a particular feature impacts prices. For example, 'GAS HEATING'. I made a <a href="http://onlinehelp.tableau.com/current/pro/desktop/en-us/sortgroup_sets.html"><strong>set</strong></a> with all the variants of heating I could find ('GAS CENTRAL HEATING', 'GAS HEAT' and so on). Now I can analyse how this feature impacts properties. Here is how it impacts the price of flats. Blue circles are properties with gas heating, and orange are without.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/330.png" alt="330"></p> <p>Very interesting, in my opinion. <em>The minimum price of properties with gas heating (blue circles) is higher than without.</em> That is expected. But the average price for properties without gas heating is higher.</p> <p>And here are kitchen appliances. <em>For 1 bedroom flats, they increase both minimum and average prices significantly.
But for bigger flats the minimum price with appliances is higher while the average price is lower.</em> Possibly this option is important for relatively cheap properties, but its weight is not that big for the bigger ones.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/340.png" alt="340"></p> <h2 id="summary">Summary</h2> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/350-1.png" alt="350"></p> Andrew Fomin f6d594b4-2639-4d0d-94bc-e0da639374c5 Thu Jun 22 2017 04:26:00 GMT-0400 (EDT) Oracle 12c Release 2 New Feature Online and Automatic Tablespace Encryption https://gavinsoorma.com/2017/06/oracle-12c-release-2-new-feature-online-and-automatic-tablespace-encryption/ <p>In Oracle 12c Release 2, <strong>tablespaces can now be encrypted while they are online and in read-write mode</strong>.  In earlier releases, the tablespace had to be taken offline first or the database had to be in the mount state and not open.</p> <p>In addition, now in 12.2 the SYSTEM, </p><div class="mgm_private_no_access"><div style="border-style:solid; border-width:1px; margin-bottom:1em; background-color:#E4F2FD; border-color:#C6D9E9; margin:5px; font-family:'Lucida Grande','Lucida Sans Unicode',Tahoma,Verdana,sans-serif; font-size:13px; color:#333333;"> <div style="margin: 5px 10px;">You need to be logged in to see this part of the content. Please <a href="https://gavinsoorma.com/login/?redirect_to=https://gavinsoorma.com/2017/06/oracle-12c-release-2-new-feature-online-and-automatic-tablespace-encryption/"><b>Login</b></a> to access.
</div> </div></div> Gavin Soorma https://gavinsoorma.com/?p=7672 Thu Jun 22 2017 00:28:23 GMT-0400 (EDT) Oracle 12c Release 2 Multitenancy New Features https://gavinsoorma.com/2017/06/oracle-12c-release-2-multitenancy-new-features/ <p>This note describes the new features related to the Oracle Database Multitenant option which have been introduced in Oracle 12c Release 2 (12.2.0.1).</p> <p>The hands-on exercises included in the note demonstrates the use of the following new multitenancy features:</p> <ul> <li>Hot Cloning</li> <li>Refreshable Pluggable Databases</li> <li>Support for pluggable databases with multiple </li></ul><div class="mgm_private_no_access"><div style="border-style:solid; border-width:1px; margin-bottom:1em; background-color:#E4F2FD; border-color:#C6D9E9; margin:5px; font-family:'Lucida Grande','Lucida Sans Unicode',Tahoma,Verdana,sans-serif; font-size:13px; color:#333333;"> <div style="margin: 5px 10px;">You need to be logged in to see this part of the content. Please <a href="https://gavinsoorma.com/login/?redirect_to=https://gavinsoorma.com/2017/06/oracle-12c-release-2-multitenancy-new-features/"><b>Login</b></a> to access. </div> </div></div> Gavin Soorma https://gavinsoorma.com/?p=7666 Wed Jun 21 2017 11:36:41 GMT-0400 (EDT) Rittman Mead at Kscope 2017 http://www.rittmanmead.com/blog/2017/06/rittman-mead-at-kscope-2017/ <img src="http://www.rittmanmead.com/blog/content/images/2017/06/Texas-logo.png" alt="Rittman Mead at Kscope 2017"><p>Rittman Mead will be well represented in San Antonio, Texas next week for Kscope 17 with some of our best from both sides of the Atlantic! Our very own Francesco Tisiot and Jordan Meyer will present various topics as well as participate in the conference events. Also, the newly named ODTUG BI Community Lead, Rittman Mead's Becky Wagner, will be on hand and leading a lot of activities throughout. 
See details below and we hope to see you in Texas.</p> <h2 id="jordan"><strong>Jordan</strong></h2> <p>Oracle Big Data Spatial and Graph enables the analysis of data sets beyond that of standard relational analytics commonly used. Through graph technology relationships can be identified that may not otherwise have been. This has practical uses including in product recommendations, social network analysis, and fraud detection.</p> <p>In this presentation we will see a practical demonstration of Oracle Big Data Spatial and Graph to load and analyze the "Panama Papers" data set. Graph algorithms will be utilized to identify key actors and organizations within the data, and patterns of relationships shown. This practical example of using the tool will give attendees a clear idea of the functionality of the tool and how it could be used within their own organization.</p> <p>When: Jun 27, 2017, Tuesday Session 7 , 11:15 am - 12:15 pm <br> Room: Magnolia</p> <h2 id="francesco"><strong>Francesco</strong></h2> <p>OBIEE 12c is the latest generation of Oracle's Enterprise analytics and reporting tool, bringing with it many powerful new features. Many users are still on earlier releases of OBIEE 11g or even 10g, and are looking to understand how they can move to OBIEE 12c to benefit from its new capabilities. </p> <p>Liberty Global is a global telecommunications company, with a long history with OBIEE going back to 10g. They wanted to move to OBIEE 12c in order to use the new Advanced Analytics options, and used Rittman Mead to support them with the full scope of the upgrade.</p> <p>In this presentation, we will see what a highly successful OBIEE 12c migration looks like. We will cover clear details of all the steps required, and discuss some of the problems encountered. Regression testing is a crucial step in any upgrade and we will show how we did this efficiently and accurately with the provided Baseline Validation Tool. 
This presentation will assist all attendees who are considering, or in the process of, an OBIEE 12c upgrade.</p> <p>When: Jun 26, 2017, Monday Session 5 , 4:45 pm - 5:45 pm <br> Room: Wisteria/Sunflower</p> <p><em>And</em></p> <p>As a DBA or sysadmin responsible for OBIEE how do you really dig into the guts of OBIEE, look at intra-component communication between the system components and examine the apparently un-examinable? What do you do when you need to trace activity beyond what is in the log files? How do you work with log files in order to give precise but low-level information? What information can be gleaned, by hook or by crook, from OBIEE?</p> <p>OBIEE provides a set of systems management and diagnostic tools, but these only take you so far. Join me in this presentation to dive deeper with OBIEE. We will take a look at a bag of tricks including undocumented configuration options, flame graphs, system call tracing, discovering undocumented REST APIs, and more! This is not <em>just</em> a geek-out - this is real-life examples of where client OBIEE projects have required that next level of diagnostic techniques and tools. Don your beanie hat and beard as we go deep!</p> <p>When: Jun 28, 2017, Wednesday Session 12 , 9:45 am - 10:45 am <br> Room: Wisteria/Sunflower</p> <h2 id="becky"><strong>Becky</strong></h2> <p>Becky Wagner is the new ODTUG BI Community Lead. 
You will find her at:</p> <p>Monday Community Lunch | 12:45 – 2:00 PM | Grand Oaks K-S</p> <p>Monday evening BI Community Night | 8:00 - 10:00 PM | Grand Oaks H <a href="http://kscope17.com/events/community-nigh-events">http://kscope17.com/events/community-nigh-events</a></p> <p>She will be doing the 5K Fun Run <a href="http://kscope17.com/events/kscope17-5k">http://kscope17.com/events/kscope17-5k</a> on Tuesday morning </p> <p>Women in Technology Lunch | 12:15– 1:45 PM | Cibolo Canyon 6 on Wednesday <a href="https://form.jotformpro.com/71134693041955">https://form.jotformpro.com/71134693041955</a></p> <p>Navigating the Oracle Business Analytics Frontier Panel <br> 9:00 AM - 11:00 AM, Cibolo Canyon 8/9/10 <br> <a href="http://kscope17.com/content/thursday-deep-dive-sessions">http://kscope17.com/content/thursday-deep-dive-sessions</a></p> Jason Davis 5af13172-b5b3-465d-b738-58fe494f114f Wed Jun 21 2017 08:45:00 GMT-0400 (EDT) Oracle BI Cloud Service (BICS) Access Options: Remote Data Connector (RDC) Overview and Configuration http://blog.performancearchitects.com/wp/2017/06/21/oracle-bi-cloud-service-bics-access-options-remote-data-connector-rdc-overview-and-configuration/ <p>Author: Doug Ross, Performance Architects</p> <p>As more organizations move their business intelligence (BI) environments to the cloud, loading and accessing enterprise data will become as important as the front-end visualizations.  <a href="https://cloud.oracle.com/business_intelligence">Oracle&#8217;s BI Cloud Service (BICS)</a> offers several options for those data requirements that go beyond simple data upload. Each has a specific purpose, features, benefits, and limitations. 
Only one option allows data to remain on-premise for querying by BICS: <a href="http://www.oracle.com/technetwork/middleware/bicloud/downloads/index.html">Remote Data Connector (RDC)</a>.</p> <p>Rather than moving data to the cloud, RDC enables a secure connection to on-premise data sources for analysis and visualization. RDC utilizes the BI Server Data Gateway running in the BICS environment to provide secure access to on-premise data using private/public key pairs and SSL communication.  The primary benefit of RDC is that it preserves the investment in the technology used to house and load on-premise data warehouses.  It offers a hybrid approach to transitioning to a cloud-based analytics environment without having to also migrate the entire data environment as well.</p> <p>RDC enables analyses in BICS to connect directly to an <a href="https://www.oracle.com/database/index.html">on-premise Oracle database</a> following proper configuration of the on-premise firewall, security, and WebLogic installation.  When an analysis is executed in BICS, a SQL request is generated and transmitted to the on-premise WebLogic server.  WebLogic passes that SQL onto the associated database, compresses the resulting dataset, and then returns that result to BICS where it is presented in a visualization view.</p> <p>This provides organizations with very large on-premise data warehouses the ability to use BICS as a front end without having to duplicate the same data in the cloud.</p> <p>A Remote Data Connector setup requires the following mandatory components:</p> <ul> <li>The <a href="http://www.oracle.com/technetwork/middleware/bi-enterprise-edition/overview/index.html">Oracle Business Intelligence Enterprise Edition (OBIEE)</a> BI Administration client tool that is used to create the RPD must be version 12.2.1.0.0 only. This is due to the RDC requiring the JDBC (JNDI) Data Source option for the connection to work. 
The configured RPD will be “lifted-and-shifted” to the BICS environment while maintaining RDC connections at the physical layer</li> <li>The on-premise database can be Oracle, Teradata, SQL Server, or DB2</li> <li>The on-premise environment must have either a configured WebLogic server or Apache Tomcat server. While prior versions of WebLogic should work, the latest version would be preferred</li> <li>The on-premise WebLogic server must be accessible externally via the necessary networking, security and firewalls configuration. In the RPD, the port defined in the physical connection must accurately route to the WebLogic server port</li> </ul> <p>One important item to understand is that once a “lift-and-shift” of the RPD is performed in the BICS environment, any previous connections to the co-located database ‘Schema Service’ will not be accessible.  The on-premise RPD data model will replace the Schema Service repository and will not be able to connect to Schema Service database objects.</p> <p>The following diagram produced by Oracle illustrates how the RDC environment works:</p> <p><a href="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/05/doug1.png"><img class="alignnone size-medium wp-image-2032" src="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/05/doug1-300x177.png" alt="" width="300" height="177" srcset="http://blog.performancearchitects.com/wp/wp-content/uploads/2017/05/doug1-300x177.png 300w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/05/doug1-768x454.png 768w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/05/doug1-624x369.png 624w, http://blog.performancearchitects.com/wp/wp-content/uploads/2017/05/doug1.png 841w" sizes="(max-width: 300px) 100vw, 300px" /></a></p> <p>Consider the following before implementing an RDC solution:</p> <p>Network performance will become a much greater factor in the execution time of analyses and visualizations.   
Large query result sets transferred over the network will likely introduce latency challenges.</p> <p>The hybrid approach to BICS data access typically requires significant assistance and support from a customer’s network support team.  There may be resistance to RDC based on corporate policies related to opening up access to internal databases from external sources.</p> Melanie Mathews http://blog.performancearchitects.com/wp/?p=2031 Wed Jun 21 2017 05:07:18 GMT-0400 (EDT) ODTUG Kscope17 Livestream Sessions http://www.odtug.com/p/bl/et/blogaid=735&source=1 If you can't make it to ODTUG Kscope17, you can still participate from home. Check out the list of sessions we're bringing you live from San Antonio, Texas! ODTUG http://www.odtug.com/p/bl/et/blogaid=735&source=1 Tue Jun 20 2017 14:44:07 GMT-0400 (EDT) ODTUG Kscope17 Livestream Sessions http://www.odtug.com/p/bl/et/blogaid=730&source=1 If you can't make it to ODTUG Kscope17, you can still participate from home. Check out the list of sessions we're bringing you live from San Antonio, Texas! 
ODTUG http://www.odtug.com/p/bl/et/blogaid=730&source=1 Tue Jun 20 2017 10:02:17 GMT-0400 (EDT) Lift and Shift of Oracle BIAPPS Artifacts to Oracle Analytics Cloud https://blogs.oracle.com/biapps/biapps_lift_shift_to_oac <p><span style="color:#800000;"><strong>Authors: Swathi Singamareddygari , Anand Sadaiyan</strong></span></p> <p><strong><span style="color:#000080;"><strong style="font-size: 13px;">Table of Contents</strong></span></strong></p> <p><a href="#_Toc482611085">Disclaimer</a><br /> <a href="#_Toc482611086">Section:1 Deliverables</a><br /> <a href="#_Toc482611087">Section: 2 Lifting and shifting Application Roles and Web Catalogue</a><br /> <a href="#_Toc482611088">Section: 3 Lifting and Shifting Repository</a><br /> <a href="#_Toc482611089">Section: 4 FAQ</a><br /> <a href="#_Toc482611090">Section: 5 Limitations</a></p> <p><a name="_Toc482611085"></a><a name="_Toc482354171"></a><strong><a name="_Toc462761353" style="background-color: rgb(255, 255, 255);"><span style="color:#000080;">Disclaimer</span></a></strong></p> <p style="text-align: justify;">This document does not replace the&nbsp;Oracle Analytics Cloud Service Documentation Library or&nbsp;other Cloud Services documents. It serves as a supplement for Lifting and Shifting Business Intelligence Applications Artifacts (BIAPPS / OBIA) to Oracle Analytics Cloud (OAC).</p> <p style="text-align: justify;">This document is written based on the&nbsp;Oracle Analytics cloud version 17.2.1.0.0 Screenshots included in this document might differ slightly from what you see on your screen.</p> <p style="text-align: justify;"><strong><em><span style="color:#800000;">Note</span>: </em></strong>&nbsp;It is always a good practice to take a snapshot of the current environment in Oracle Analytics Cloud, before lifting and shifting the artifacts. 
Ensure that you create a snapshot before proceeding.&nbsp;</p> <p><strong><a name="_Toc482611086"><span style="color:#000080;">Section:1 Deliverables</span></a></strong></p> <p style="text-align: justify;">Download the Oracle Analytics Cloud Deliverables Zip file (BIAPPS_10.2.0_OACV1.zip) from ARU 21167611 (download patch&nbsp;<span>21167611 from Oracle Support).&nbsp;</span>The deliverables are based on the BIAPPS 10.2 release.</p> <p style="text-align: justify;"><strong>The zip file consists of:</strong><br /> 1) OracleBIApps_10.2.0_OACV1.rpd (Password: welcome1)<br /> 2) BIAPPS_10.2.0_OACV1.bar (Password: Admin123)</p> <p><br /> <strong><a name="_Toc482611087"><span style="color:#000080;">Section:2 Lifting and shifting Application Roles and Web Catalogue</span></a></strong></p> <p><strong>Uploading the Application Roles and web catalogue</strong></p> <ol> <li>Log in to the Oracle Analytics Cloud VA page.</li> <li>From the Oracle Analytics Cloud home page, navigate to the Console and click Snapshots.</li> <li>Click Upload Snapshot to upload the delivered BAR file.
<a href="https://docs.oracle.com/cloud/latest/analytics-cloud/ACABI/GUID-3F57E1E4-BFF9-4383-8999-32377E3F5B4F.htm#GUID-EA2778B4-8B7C-4135-91F0-90A223A35A80">See Uploading Snapshots</a>.</li> <li>If a virus scanner is not configured, click &ldquo;Proceed without a virus scanner&rdquo;.</li> </ol> <p style="margin-left: 40px;"><img alt="" src="https://cdn.app.compendium.com/uploads/user/e7c690e8-6ff9-102a-ac6d-e4aebca50425/b8be8dec-54bb-455a-8b50-2ecd980da759/Image/c3301faef9b50a87bfbdcea9fbf3479d/image002.png" style="width: 1081px; height: 538px;" /></p> <ol> <li value="5">Select the delivered BAR file and enter &ldquo;Admin123&rdquo; as the password.</li> </ol> <p style="margin-left: 40px;"><img alt="" src="https://cdn.app.compendium.com/uploads/user/e7c690e8-6ff9-102a-ac6d-e4aebca50425/b8be8dec-54bb-455a-8b50-2ecd980da759/Image/7702d1d8e3cd58ed0afb3d4febece4ea/image003.png" style="width: 1280px; height: 719px;" /></p> <ol> <li style="text-align: justify;" value="6">Select the uploaded snapshot, click the Restore action, and in the Restore Snapshot popup select Application Roles and Catalog, then click Restore to restore the snapshot for the Application Roles and Web catalogue.
<a href="https://docs.oracle.com/cloud/latest/analytics-cloud/ACABI/GUID-C7DE34A5-7A67-4415-98B7-1CA9E5235480.htm#GUID-C88E3DCD-8B10-4826-B0F4-EEBCFD0A2897">See Restoring Snapshots</a>.</li> </ol> <p style="margin-left: 40px;"><img alt="" src="https://cdn.app.compendium.com/uploads/user/e7c690e8-6ff9-102a-ac6d-e4aebca50425/b8be8dec-54bb-455a-8b50-2ecd980da759/Image/2f6906148c07eb0a586ed1bf6b025a4e/image004.png" style="width: 1240px; height: 705px;" /></p> <ol> <li style="text-align: justify;" value="7">Verify the imported Application Roles in the Application Role Management page (Console &gt; Users and Roles &gt; Application Roles).<br /> &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;<img alt="" src="https://cdn.app.compendium.com/uploads/user/e7c690e8-6ff9-102a-ac6d-e4aebca50425/b8be8dec-54bb-455a-8b50-2ecd980da759/Image/4559fb21482a4680f4961edd827b1164/image005.png" style="width: 974px; height: 562px;" /></li> </ol> <ol> <li style="text-align: justify;" value="8">Verify the Webcat artifacts by clicking the Dashboards menu from the Classic Home.</li> </ol> <p style="margin-left: 40px;"><img alt="" src="https://cdn.app.compendium.com/uploads/user/e7c690e8-6ff9-102a-ac6d-e4aebca50425/b8be8dec-54bb-455a-8b50-2ecd980da759/Image/4dce8ae9bc847d85b29107e827737995/image006.png" style="width: 801px; height: 824px;" /></p> <p style="margin-left: 40px;"><img alt="" src="https://cdn.app.compendium.com/uploads/user/e7c690e8-6ff9-102a-ac6d-e4aebca50425/b8be8dec-54bb-455a-8b50-2ecd980da759/Image/463eaaf4fdaa34c54cb7280edec299cb/image007.png" style="width: 626px; height: 311px;" /></p> <p>&nbsp;</p> <p><strong><a name="_Toc482611088"><span style="color:#000080;">Section:3 Lifting and Shifting Repository</span></a></strong></p> <p style="text-align: justify;"><em><span style="color:#800000;"><strong>Note</strong></span></em>: Any modifications to the repository should be done in the on-premise environment.
No modifications are allowed in Oracle Analytics Cloud. Allow about five minutes for the Oracle Analytics Cloud environment to refresh after the repository upload.<br /> <br /> The Oracle BI Applications repository is delivered with the Oracle Analytics Cloud deliverables zip file.</p> <p><span style="color:#000080;"><strong>Uploading the Repository:</strong></span></p> <ol> <li style="text-align: justify;">Modify the delivered repository with the proper connection details (connection pools and schema variables such as OLAPTBO and CM_TBO).</li> </ol> <ol> <li style="text-align: justify;" value="2">Log in to the Oracle Analytics Cloud environment.</li> </ol> <ol> <li style="text-align: justify;" value="3">Navigate to the Console from the home page and click Snapshots.</li> </ol> <p style="margin-left: 40px;"><img alt="" src="https://cdn.app.compendium.com/uploads/user/e7c690e8-6ff9-102a-ac6d-e4aebca50425/b8be8dec-54bb-455a-8b50-2ecd980da759/Image/59226b87d6c599f1de6ed9331e63032f/image008.png" style="width: 1259px; height: 676px;" /></p> <ol> <li style="text-align: justify;" value="4">Replace the Oracle Analytics Cloud data model with the on-premises repository using the &ldquo;Replace Data model&rdquo; option. 
If a virus scanner is not configured, click &ldquo;Proceed without a virus scanner&rdquo;.</li> </ol> <p style="margin-left: 40px;"><img alt="" src="https://cdn.app.compendium.com/uploads/user/e7c690e8-6ff9-102a-ac6d-e4aebca50425/b8be8dec-54bb-455a-8b50-2ecd980da759/Image/1e3a033f744f502ab06f16f9f6459884/image009.png" style="width: 1230px; height: 805px;" /></p> <ol> <li value="5">Choose the on-premises repository and provide the repository password (&quot;welcome1&quot; without quotes).</li> </ol> <p style="margin-left: 40px;"><img alt="" src="https://cdn.app.compendium.com/uploads/user/e7c690e8-6ff9-102a-ac6d-e4aebca50425/b8be8dec-54bb-455a-8b50-2ecd980da759/Image/332f046c37a3e6190d05012efb5d8525/image010.png" style="width: 1410px; height: 751px;" /></p> <ol> <li style="text-align: justify;" value="6">Verify the uploaded RPD by navigating to Analyses and clicking &ldquo;Create Analysis&rdquo;. The available subject areas are displayed. For more details on lifting and shifting the repository, see <a href="http://docs.oracle.com/cloud/latest/reportingcs_use/BILPD/GUID-2BEB60F6-986D-4A7A-9D63-EEE67083E98A.htm#BILPD-GUID-2BEB60F6-986D-4A7A-9D63-EEE67083E98A">Uploading an On-Premises Data Model to Oracle Cloud Service</a>.</li> </ol> <p><strong><a name="_Toc482611089"><span style="color:#000080;">Section 4: FAQ</span></a></strong></p> <ol> <li style="text-align: justify;">Custom application role permissions are not applied to repository objects.<br /> <span style="color:#800000;"><strong>Solution</strong></span>: Create the custom application roles in VA first, and then upload the repository.<br /> &nbsp;</li> <li style="text-align: justify;" value="2">Unable to change permissions on a web catalog object; an Assertion Failure error is displayed.</li> </ol> <p style="margin-left: 40px;"><img alt="" 
src="https://cdn.app.compendium.com/uploads/user/e7c690e8-6ff9-102a-ac6d-e4aebca50425/b8be8dec-54bb-455a-8b50-2ecd980da759/Image/36ed4de35ee7584c2c0974178f9421b4/image011.png" style="width: 919px; height: 205px;" /><br /> <br /> <strong><span style="color:#800000;">Solution</span>: </strong>Delete any unresolved accounts in the web catalog object, and then change the permissions.</p> <p><a name="_Toc482611090"><span style="color:#000080;"><strong>Section 5: Limitations</strong></span></a></p> <p>The following features used in Oracle BI Applications are not supported in the Oracle Analytics Cloud environment:</p> <ul> <li>Group</li> <li>KPI</li> <li>KPI Watchlist</li> <li>List Format</li> <li>Segment</li> <li>Scorecard</li> </ul> <p>The affected catalog objects are listed below.</p> <table border="1" cellpadding="4">
<thead> <tr> <th>Name</th> <th>Path</th> <th>Signature</th> </tr> </thead>
<tbody>
<tr> <td>Campaign Members Load Format</td> <td>/shared/Marketing/Segmentation/List Formats/Campaign Members Load Format</td> <td>Campaign Load</td> </tr>
<tr> <td>Campaign Members Suspects Load Format</td> <td>/shared/Marketing/Segmentation/List Formats/Campaign Members Suspects Load Format</td> <td>Campaign Load</td> </tr>
<tr> <td>Consumer Campaign Members Load Format</td> <td>/shared/Marketing/Segmentation/List Formats/Consumer Campaign Members Load Format</td> <td>Campaign Load</td> </tr>
<tr> <td>Consumer Leads Import Load Format</td> <td>/shared/Marketing/Segmentation/List Formats/Consumer Leads Import Load Format</td> <td>Campaign Load</td> </tr>
<tr> <td>Leads Import Load Format</td> <td>/shared/Marketing/Segmentation/List Formats/Leads Import Load Format</td> <td>Campaign Load</td> </tr>
<tr> <td>Campaign Load - Contacts and Prospects Example</td> <td>/shared/Marketing/Segmentation/List Formats/Siebel List Formats/Campaign Load - Contacts and Prospects Example</td> <td>Campaign Load</td> </tr>
<tr> <td>Campaign Load - Database Writeback Example</td> <td>/shared/Marketing/Segmentation/List Formats/Siebel List Formats/Campaign Load - Database Writeback Example</td> <td>Campaign Load</td> </tr>
<tr> <td>Mutual Exclusion Campaign Load - Contacts and Prospects Example</td> <td>/shared/Marketing/Segmentation/List Formats/Siebel List Formats/Mutual Exclusion Campaign Load - Contacts and Prospects Example</td> <td>Campaign Load</td> </tr>
<tr> <td>Suspects Import Load Format</td> <td>/shared/Marketing/Segmentation/List Formats/Suspects Import Load Format</td> <td>Campaign Load</td> </tr>
<tr> <td>All Groups</td> <td>/shared/Human Capital Management/_filters/Human Resources - Workforce Deployment/Role Dashboards/All Groups</td> <td>Group</td> </tr>
<tr> <td>Below Top Performance</td> <td>/shared/Human Capital Management/_filters/Human Resources - Workforce Deployment/Role Dashboards/Below Top Performance</td> <td>Group</td> </tr>
<tr> <td>Global</td> <td>/shared/Human Capital Management/_filters/Human Resources - Workforce Deployment/Role Dashboards/Global</td> <td>Group</td> </tr>
<tr> <td>Top Performers</td> <td>/shared/Human Capital Management/_filters/Human Resources - Workforce Deployment/Role Dashboards/Top Performers</td> <td>Group</td> </tr>
<tr> <td>Average Negotiation Cycle Time</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Customer/Average Negotiation Cycle Time</td> <td>KPI</td> </tr>
<tr> <td>Fulfilled Requisition Lines past expected date</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Customer/Fulfilled Requisition Lines past expected date</td> <td>KPI</td> </tr>
<tr> <td>Late Receipts</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Customer/Late Receipts</td> <td>KPI</td> </tr>
<tr> <td>Processed Requisition Lines past expected date</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Customer/Processed Requisition Lines past expected date</td> <td>KPI</td> </tr>
<tr> <td>Procurement Cycle Time</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Customer/Procurement Cycle Time</td> <td>KPI</td> </tr>
<tr> <td>Unfulfilled Requisition Lines past expected date</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Customer/Unfulfilled Requisition Lines past expected date</td> <td>KPI</td> </tr>
<tr> <td>Off-Contract Spend</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Financial/Off-Contract Spend</td> <td>KPI</td> </tr>
<tr> <td>Perfect invoices</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Financial/Perfect invoices</td> <td>KPI</td> </tr>
<tr> <td>Realized Cost Savings</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Financial/Realized Cost Savings</td> <td>KPI</td> </tr>
<tr> <td>Invoice Automation</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Operations/Invoice Automation</td> <td>KPI</td> </tr>
<tr> <td>Manual Requisition Lines Rate</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Operations/Manual Requisition Lines Rate</td> <td>KPI</td> </tr>
<tr> <td>PO Transactions per Buyer</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Operations/PO Transactions per Buyer</td> <td>KPI</td> </tr>
<tr> <td>Processed Negotiation Lines</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Operations/Processed Negotiation Lines</td> <td>KPI</td> </tr>
<tr> <td># of Suppliers per Category</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Supplier/# of Suppliers per Category</td> <td>KPI</td> </tr>
<tr> <td>% of Spend By Diversified Suppliers</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Supplier/% of Spend By Diversified Suppliers</td> <td>KPI</td> </tr>
<tr> <td>On-Time Delivery performance</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Supplier/On-Time Delivery performance</td> <td>KPI</td> </tr>
<tr> <td>Quality Performance</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Supplier/Quality Performance</td> <td>KPI</td> </tr>
<tr> <td>Returns</td> <td>/shared/Procurement/Procurement Scorecard/KPIs/Supplier/Returns</td> <td>KPI</td> </tr>
<tr> <td>Exact Match Rate</td> <td>/shared/Supply Chain Management/Analytic Library/Embedded Content/Logistics/KPIs/Exact Match Rate</td> <td>KPI</td> </tr>
<tr> <td>Hit/Miss Accuracy</td> <td>/shared/Supply Chain Management/Analytic Library/Embedded Content/Logistics/KPIs/Hit/Miss Accuracy</td> <td>KPI</td> </tr>
<tr> <td>Inventory Value</td> <td>/shared/Supply Chain Management/Analytic Library/Embedded Content/Logistics/KPIs/Inventory Value</td> <td>KPI</td> </tr>
<tr> <td>Average Change Order Approval Time</td> <td>/shared/Supply Chain Management/Analytic Library/Embedded Content/PIM/KPIs/Average Change Order Approval Time</td> <td>KPI</td> </tr>
<tr> <td>Average Change Order Cycle Time</td> <td>/shared/Supply Chain Management/Analytic Library/Embedded Content/PIM/KPIs/Average Change Order Cycle Time</td> <td>KPI</td> </tr>
<tr> <td>Average New Item Creation Approval Time</td> <td>/shared/Supply Chain Management/Analytic Library/Embedded Content/PIM/KPIs/Average New Item Creation Approval Time</td> <td>KPI</td> </tr>
<tr> <td>Average New Item Creation Cycle Time</td> <td>/shared/Supply Chain Management/Analytic Library/Embedded Content/PIM/KPIs/Average New Item Creation Cycle Time</td> <td>KPI</td> </tr>
<tr> <td>Percentage of Shared Categories</td> <td>/shared/Supply Chain Management/Analytic Library/Embedded Content/PIM/KPIs/Percentage of Shared Categories</td> <td>KPI</td> </tr>
<tr> <td>List Export - Contacts Example</td> <td>/shared/Marketing/Segmentation/List Formats/Siebel List Formats/List Export - Contacts Example</td> <td>List Export</td> </tr>
<tr> <td>Analytics Data Load - Leads Example</td> <td>/shared/Marketing/Segmentation/List Formats/Siebel List Formats/Analytics Data Load - Leads Example</td> <td>Marketing BI Data Load</td> </tr>
<tr> <td>Analytics Data Load - Responses Example</td> <td>/shared/Marketing/Segmentation/List Formats/Siebel List Formats/Analytics Data Load - Responses Example</td> <td>Marketing BI Data Load</td> </tr>
<tr> <td>Campaign Members Export Format</td> <td>/shared/Marketing/Segmentation/List Formats/Campaign Members Export Format</td> <td>Marketing Email Server</td> </tr>
<tr> <td>Email Personalization - Contacts - OLTP Example</td> <td>/shared/Marketing/Segmentation/List Formats/Siebel List Formats/Email Personalization - Contacts - OLTP Example</td> <td>Marketing Email Server</td> </tr>
<tr> <td>_CauseAndEffectLinkages</td> <td>/shared/Procurement/Procurement Scorecard/_CauseAndEffectLinkages</td> <td>Scorecard Cause And Effect Linkages</td> </tr>
<tr> <td>Cause &amp; Effect Map: Improve Response Time</td> <td>/shared/Procurement/Procurement Scorecard/Cause &amp; Effect Map: Improve Response Time</td> <td>Scorecard Causes And Effects View</td> </tr>
<tr> <td>Automate Invoice Processing</td> <td>/shared/Procurement/Procurement Scorecard/Automate Invoice Processing</td> <td>Scorecard Initiative</td> </tr>
<tr> <td>Consolidate Supplier Base</td> <td>/shared/Procurement/Procurement Scorecard/Consolidate Supplier Base</td> <td>Scorecard Initiative</td> </tr>
<tr> <td>Develop and Implement New policies to support Contract compliance</td> <td>/shared/Procurement/Procurement Scorecard/Develop and Implement New policies to support Contract compliance</td> <td>Scorecard Initiative</td> </tr>
<tr> <td>Establish and Monitor SLAs</td> <td>/shared/Procurement/Procurement Scorecard/Establish and Monitor SLAs</td> <td>Scorecard Initiative</td> </tr>
<tr> <td>Implement Internet Supplier Portal</td> <td>/shared/Procurement/Procurement Scorecard/Implement Internet Supplier Portal</td> <td>Scorecard Initiative</td> </tr>
<tr> <td>Implement Self Service Procurement Application</td> <td>/shared/Procurement/Procurement Scorecard/Implement Self Service Procurement Application</td> <td>Scorecard Initiative</td> </tr>
<tr> <td>Implement Spend Analytics</td> <td>/shared/Procurement/Procurement Scorecard/Implement Spend Analytics</td> <td>Scorecard Initiative</td> </tr>
<tr> <td>Initiatives</td> <td>/shared/Procurement/Procurement Scorecard/Initiatives</td> <td>Scorecard Initiative</td> </tr>
<tr> <td>Monitor Performance and provide regular feedback on quarterly basis</td> <td>/shared/Procurement/Procurement Scorecard/Monitor Performance and provide regular feedback on quarterly basis</td> <td>Scorecard Initiative</td> </tr>
<tr> <td>Monitor Spend and Savings on Monthly Basis</td> <td>/shared/Procurement/Procurement Scorecard/Monitor Spend and Savings on Monthly Basis</td> <td>Scorecard Initiative</td> </tr>
<tr> <td>Monitor Spend by Diversified Suppliers on monthly basis</td> <td>/shared/Procurement/Procurement Scorecard/Monitor Spend by Diversified Suppliers on monthly basis</td> <td>Scorecard Initiative</td> </tr>
<tr> <td>Reward high performing employees</td> <td>/shared/Procurement/Procurement Scorecard/Reward high performing employees</td> <td>Scorecard Initiative</td> </tr>
<tr> <td>_initiativeTree</td> <td>/shared/Procurement/Procurement Scorecard/_initiativeTree</td> <td>Scorecard Initiative Tree</td> </tr>
<tr> <td>Mission</td> <td>/shared/Procurement/Procurement Scorecard/Mission</td> <td>Scorecard Mission</td> </tr>
<tr> <td>Control Spend</td> <td>/shared/Procurement/Procurement Scorecard/Control Spend</td> <td>Scorecard Objective</td> </tr>
<tr> <td>Develop and Retain Strategic Suppliers</td> <td>/shared/Procurement/Procurement Scorecard/Develop and Retain Strategic Suppliers</td> <td>Scorecard Objective</td> </tr>
<tr> <td>Improve Response Time</td> <td>/shared/Procurement/Procurement Scorecard/Improve Response Time</td> <td>Scorecard Objective</td> </tr>
<tr> <td>Improve Supplier Performance</td> <td>/shared/Procurement/Procurement Scorecard/Improve Supplier Performance</td> <td>Scorecard Objective</td> </tr>
<tr> <td>Increase Productivity</td> <td>/shared/Procurement/Procurement Scorecard/Increase Productivity</td> <td>Scorecard Objective</td> </tr>
<tr> <td>New Objective</td> <td>/shared/Procurement/Procurement Scorecard/New Objective</td> <td>Scorecard Objective</td> </tr>
<tr> <td>New Objective 1</td> <td>/shared/Procurement/Procurement Scorecard/New Objective 1</td> <td>Scorecard Objective</td> </tr>
<tr> <td>Procurement Scorecard</td> <td>/shared/Procurement/Procurement Scorecard/Procurement Scorecard</td> <td>Scorecard Objective</td> </tr>
<tr> <td>Promote Supplier Diversity</td> <td>/shared/Procurement/Procurement Scorecard/Promote Supplier Diversity</td> <td>Scorecard Objective</td> </tr>
<tr> <td>Reduce Operational Costs</td> <td>/shared/Procurement/Procurement Scorecard/Reduce Operational Costs</td> <td>Scorecard Objective</td> </tr>
<tr> <td>Reduce Out-of-process Spend</td> <td>/shared/Procurement/Procurement Scorecard/Reduce Out-of-process Spend</td> <td>Scorecard Objective</td> </tr>
<tr> <td>Customer</td> <td>/shared/Procurement/Procurement Scorecard/Customer</td> <td>Scorecard Perspective</td> </tr>
<tr> <td>Financial</td> <td>/shared/Procurement/Procurement Scorecard/Financial</td> <td>Scorecard Perspective</td> </tr>
<tr> <td>Operations</td> <td>/shared/Procurement/Procurement Scorecard/Operations</td> <td>Scorecard Perspective</td> </tr>
<tr> <td>Supplier</td> <td>/shared/Procurement/Procurement Scorecard/Supplier</td> <td>Scorecard Perspective</td> </tr>
<tr> <td>_Perspectives</td> <td>/shared/Procurement/Procurement Scorecard/_Perspectives</td> <td>Scorecard Perspective List</td> </tr>
<tr> <td>_scorecardSettings</td> <td>/shared/Procurement/Procurement Scorecard/_scorecardSettings</td> <td>Scorecard Settings</td> </tr>
<tr> <td>Strategy Map</td> <td>/shared/Procurement/Procurement Scorecard/Strategy Map</td> <td>Scorecard Strategy Map View</td> </tr>
<tr> <td>_strategyTree</td> <td>/shared/Procurement/Procurement Scorecard/_strategyTree</td> <td>Scorecard Strategy Tree</td> </tr>
<tr> <td>Strategy Tree</td> <td>/shared/Procurement/Procurement Scorecard/Strategy Tree</td> <td>Scorecard Strategy Tree View</td> </tr>
<tr> <td>Vision</td> <td>/shared/Procurement/Procurement Scorecard/Vision</td> <td>Scorecard Vision</td> </tr>
<tr> <td>Suspect Sync Segment</td> <td>/shared/Marketing/Segmentation/Segments/Suspect Sync Segment</td> <td>Segment</td> </tr>
<tr> <td>Logistics KPI Watchlist</td> <td>/shared/Supply Chain Management/Analytic Library/Embedded Content/Logistics/KPIs/Logistics KPI Watchlist</td> <td>Watchlist</td> </tr>
<tr> <td>PIM KPI Watchlist</td> <td>/shared/Supply Chain Management/Analytic Library/Embedded Content/PIM/KPIs/PIM KPI Watchlist</td> <td>Watchlist</td> </tr>
</tbody> </table> <p><a href="https://blogs.oracle.com/biapps/biapps-on-paas" style="text-align: justify;">All blogs related to BIAPPS on PAAS</a></p> Anand Sadaiyan https://blogs.oracle.com/biapps/biapps_lift_shift_to_oac Tue Jun 20 2017 02:54:00 GMT-0400 (EDT) All You Need to Know About ODTUG Kscope17 Beacon Technology http://www.odtug.com/p/bl/et/blogaid=728&source=1 At ODTUG Kscope17, we are using wearable beacon technology to make the event better, and understand what works and what does not. ODTUG http://www.odtug.com/p/bl/et/blogaid=728&source=1 Mon Jun 19 2017 14:18:22 GMT-0400 (EDT) Unify: Could it be any easier? http://www.rittmanmead.com/blog/2017/06/unify-could-it-be-easier/ <p>Rittman Mead’s Unify is the easiest and most efficient method to pull your OBIEE reporting data directly into your local Tableau environment. 
No longer will you have to worry about database connection credentials, Excel exports, or any other roundabout way to get your data where you need it to be.</p> <p>Unify leverages OBIEE’s existing metadata layer to provide quick access to your curated data through a standard Tableau Web Data Connector. After a short installation and configuration process, you can be building Tableau workbooks from your OBIEE data in minutes.</p> <p>This blog post will demonstrate how intuitive and easy it is to use the Unify application. We will only cover using Unify and its features, since once the data gets into Tableau it can be used like any other Tableau data source. The environment shown already has Unify <a href="https://www.youtube.com/watch?v=nc-Ro258W88">installed and configured</a>, so we can jump right in and start using the tool immediately.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/ss1.png" alt=""></p> <p>To start pulling data from OBIEE using Unify, we need to create a new Web Data Connector Data Source in Tableau. This data source will prompt us for a URL to access Unify. In this instance, Unify is installed as a desktop application, so the URL is <a href="http://localhost:8080/unify">http://localhost:8080/unify</a>. </p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-16-at-9.33.57-AM.png" alt=""></p> <p>Once we put in the URL, we’re shown an authentication screen. This screen will allow us to authenticate against OBIEE using our existing OBIEE credentials. In this case, I will authenticate as the weblogic user.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-16-at-9.37.19-AM.png" alt=""></p> <p>Once authenticated, we are welcomed by a window where we can construct an OBIEE query visually. On the left-hand side of the application, I can select the Subject Area I wish to query and see a list of tables and columns in the selected Subject Area. 
There are additional options along the top of the window, and I can see all saved queries on the right-hand side of the window. </p> <p>The center of the window is where we can see the current query, as well as a preview of the query results. Since I have not started building a query yet, this area is blank.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-16-at-9.44.32-AM.png" alt=""></p> <p>Unify allows us to either build a new query from scratch, or select an existing OBIEE report. First, let’s build our own query. The left-hand side of the screen displays the Subject Areas and Columns that I have access to in OBIEE. With a Subject Area selected, I can drag columns, or double-click them, to add them to the current query. In the screenshot above, I have added three columns to my current query, “P1 Product”, “P2 Product Type”, and “1 - Revenue”. <br> <img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-16-at-9.47.52-AM.png" alt=""></p> <p>If we wanted to, we could also create new columns by defining a Column Name and Column Formula. We even have the ability to modify existing column formulas for our query. We can do this by clicking the gear icon for a specific column, or by double-clicking the grey bar at the top of the query window.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-16-at-9.51.29-AM.png" alt=""></p> <p>It’s also possible to add filters to our data set. By clicking the Filter icon at the top of the window, we can view the current filters for the query. We can then add filters the same way we would add columns, by double-clicking or dragging the specific column. In the example shown, I have added a filter on the column “D2 Department” where the column value equals “Local Plants Dept.”. 
</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-16-at-10.11.09-AM.png" alt=""></p> <p>Filters can be configured using any of the familiar methods, such as checking if a value exists in a list of values, numerical comparisons, or even using repository or session variables.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-16-at-10.10.24-AM.png" alt=""></p> <p>Now that we have our columns selected and our filters defined, we can execute this query and see a preview of the result set. By clicking the “Table” icon in the top header of the window, we can preview the result.</p> <p>Once we are comfortable with the results of the query, we can export the results to Tableau. It is important to understand that the preview data is trimmed down to 500 rows by default, so don’t worry if you think something is missing! This value, and the export row limit, can be configured, but for now we can export the results using the green “Unify” button at the top right-hand corner of the window.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-16-at-10.19.27-AM.png" alt=""></p> <p>When this button is clicked, the Unify window will close and the query will execute. You will then be taken to a new Tableau Workbook with the results of the query as a Data Source. We can now use this query as a data source in Tableau, just as we would with any other data source.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-16-at-10.26.41-AM.png" alt=""></p> <p>But what if we have existing reports we want to use? Do we have to rebuild the report from scratch in the web data connector? Of course not! With Unify, you can select existing reports and pull them directly into Tableau.</p> <p>Instead of adding columns from the left-hand pane, we can select the “Open” icon, which will let us select an existing report. 
We can then export this report to Tableau, just as before.</p> <p>Now let’s try to do something a little more complicated. OBIEE doesn’t have the capability to execute queries across Subject Areas without common tables in the business model; however, Tableau can perform joins between two data sources (so long as we select the correct join conditions). We can use Unify to pull two queries from different OBIEE Subject Areas, and perform a data mashup with the two Subject Areas in Tableau.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-16-at-10.45.17-AM.png" alt=""></p> <p>Here I’ve created a query with “Product Number” and “Revenue”, both from the Subject Area “A - Sample Sales”. I’ve saved this query as “Sales”. I can then click the “New” icon in the header to create a new query.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-16-at-10.43.30-AM.png" alt=""></p> <p>This second query is using the “C - Sample Costs” Subject Area, and is saved as “Costs”. This query contains the columns “Product Number”, “Variable Costs”, and “Fixed Costs”.</p> <p>When I click the Unify button, both of these queries will be pulled into Tableau as two separate data sources. Since both of the queries contain the “Product Number” column, I can join these data sources on the “Product Number” column. In fact, Tableau is smart enough to do this for us:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/Screen-Shot-2017-06-16-at-10.47.24-AM.png" alt=""></p> <p>We now have two data sets, each from a different OBIEE subject area, joined and available for visualization in Tableau. Wow, that was easy!</p> <p>What about refreshing the data? Good question! The exported data sources are published as data extracts, so all you need to do to refresh the data is select the data source and hit the refresh button. 
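<p>The two-source mashup described above amounts to an inner join on the shared “Product Number” column. If you want to reason about exactly what Tableau does with the two Unify extracts, the logic can be sketched in a few lines of TypeScript; the row values below are invented for illustration and only mirror the column names from the two queries in this post.</p>

```typescript
// Illustrative stand-ins for the two Unify extracts: a "Sales" query
// (Product Number, Revenue) and a "Costs" query (Product Number,
// Variable Costs, Fixed Costs). All values are made up.
interface SalesRow { productNumber: string; revenue: number; }
interface CostsRow { productNumber: string; variableCosts: number; fixedCosts: number; }
type JoinedRow = SalesRow & Omit<CostsRow, "productNumber">;

const sales: SalesRow[] = [
  { productNumber: "P100", revenue: 5000 },
  { productNumber: "P200", revenue: 3200 },
];

const costs: CostsRow[] = [
  { productNumber: "P100", variableCosts: 1200, fixedCosts: 800 },
  { productNumber: "P200", variableCosts: 900, fixedCosts: 600 },
  { productNumber: "P300", variableCosts: 150, fixedCosts: 75 }, // no matching sale
];

// Inner join on the shared "Product Number" key -- the same join
// Tableau proposes automatically when both sources expose the column.
function innerJoin(left: SalesRow[], right: CostsRow[]): JoinedRow[] {
  const byKey = new Map(right.map((r): [string, CostsRow] => [r.productNumber, r]));
  return left.flatMap((l) => {
    const match = byKey.get(l.productNumber);
    return match
      ? [{ ...l, variableCosts: match.variableCosts, fixedCosts: match.fixedCosts }]
      : [];
  });
}

const joined = innerJoin(sales, costs);
```

<p>A product that has costs but no sales (P300 here) drops out of the result, which is the default inner-join behaviour; switching the join type in Tableau changes which unmatched rows survive.</p>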
If you are not authenticated with OBIEE, or your session has expired, you will simply be prompted to re-authenticate.</p> <p>Using Tableau to consume OBIEE data has never been easier. Rittman Mead’s Unify allows users to connect to OBIEE as a data source within a Tableau environment in an intuitive and efficient way. If only everything were this easy!</p> <p>Interested in getting OBIEE data into Tableau? <a href="mailto:info+unifynp@rittmanmead.com" target="_blank">Contact us</a> to see how we can help, or head over to <a href="https://unify.ritt.md">https://unify.ritt.md</a> to get a free Unify trial version.</p> Nick Padgett 3dbe3dfe-aea3-4302-8cc2-2f95c1e57805 Mon Jun 19 2017 10:00:00 GMT-0400 (EDT) Unify - An Insight Into the Product http://www.rittmanmead.com/blog/2017/06/unify-an-insight-into-the-product/ <img src="http://www.rittmanmead.com/blog/content/images/2017/06/endtoend.jpg" alt="Unify - An Insight Into the Product"><p><a href="https://www.rittmanmead.com/blog/2017/06/unify-see-your-data-from-every-perspective/">Monday, 12 Jun</a> saw the official release of <a href="https://unify.ritt.md">Unify</a>, Rittman Mead's very own connector between Tableau and OBIEE. It provides a simple but powerful integration between the two applications that allows you to execute queries through OBIEE and manipulate and render the datasets using Tableau.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/unify.png" alt="Unify - An Insight Into the Product"></p> <h1 id="whywemadeit">Why We Made It</h1> <p>One of the first questions of course would be <em>why</em> we would want to do this in the first place. The excellent thing about OBI is that it acts as an abstraction layer on top of a database, allowing analysts to write efficient and secure reports without going into the detail of writing queries. As with any abstraction, it is a trade of simplicity for capability. 
Products like Tableau and Data Visualiser seek to reverse this trade, putting the power back in the hands of the report builder. However, without quoting Spiderman, care should be taken when doing this. </p> <p>The result can be that users write inefficient queries, or worse still, incorrect ones. We know there will be some out there that use self-service tools as purely a visualisation engine, simply dropping pre-made datasets into them. If you are looking to produce sustainable, scalable and accessible reporting systems, you need to tackle the problem both at the data acquisition stage as well as the communication stage at the end.</p> <p>If you are already meeting both requirements, perhaps by using OBI with Data Visualiser (formerly Visual Analyser) or by other means, then that's perfectly good. However, we know from experience that there are many of you out there that have already invested heavily into both OBI and Tableau as separate solutions. Rather than have them linger in a state of conflict, we'd rather nurse them into a state of symbiosis.</p> <p>The idea behind Unify is that it bridges this gap, allowing you to use your OBIEE system as an efficient data acquisition platform and Tableau as an intuitive playground for users who want to do a bit more with their data. Unify works by using the Tableau Web Data Connector as a data source and then our customised software to act as an interface for creating OBIEE queries and then exporting them into Tableau.</p> <h1 id="howitworks">How It Works</h1> <p>Unify uses Tableau's latest <a href="https://www.tableau.com/about/blog/2015/8/connect-just-about-any-web-data-new-web-data-connector-42246">Web Data Connector</a> data source to allow us to dynamically query OBIEE and extract data into Tableau. Once a dataset is extracted into Tableau, it can be used with Tableau as normal, taking advantage of all of Tableau's powerful features. 
This native integration means you can add in OBIEE data sources just as you would add in any others - Excel files, SQL results, etc. Then you can join the data sources using Tableau itself, even if the data sources don't join up together in the background.</p> <p>First you open up Tableau and add a Web Data Connector source:</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/1-tableau.png" alt="Unify - An Insight Into the Product"></p> <p>Then give the link to the Unify application, e.g. <code>http://localhost:8080/unify</code>. This will open up Unify and prompt you to log in with your OBIEE credentials. This is important as Unify operates through the OBIEE server layer in order to maintain all security permissions that you've already defined.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/2-login.png" alt="Unify - An Insight Into the Product"></p> <p>Now that the application is open, you can make OBIEE queries using the interface provided. This is a bit like Answers and allows you to query from any of your available subject areas and presentation columns. The interface also allows you to use filtering, column formulae and OBIEE variables much in the same way as Answers does. </p> <p>Alternatively, you can open up an existing report that you've made in OBIEE and then edit it at your leisure. Unify will display a preview of the dataset so you can tweak it until you are happy that it is what you want to bring into Tableau.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/3-query.png" alt="Unify - An Insight Into the Product"></p> <p>Once you're happy with your dataset, click the <strong>Unify</strong> button in the top right and it will export the data into Tableau. From this point, it behaves exactly as Tableau does with any other data set. This means you can join your OBIEE dataset to external sources, or bring in queries from multiple subject areas from OBIEE and join them in Tableau. 
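<p>Under the hood, a Tableau Web Data Connector is a small piece of JavaScript with two callbacks: <code>getSchema</code>, which declares the columns of the dataset, and <code>getData</code>, which appends the rows. The sketch below (in TypeScript, with a tiny mock standing in for the <code>tableau</code> host object that Tableau normally supplies) shows the shape of that handshake. The query name, columns, and rows are invented for illustration, and this is not Unify's actual implementation.</p>

```typescript
// Simplified shapes borrowed from the Tableau Web Data Connector (WDC 2.x) API.
interface ColumnInfo { id: string; dataType: "string" | "float"; }
interface TableInfo { id: string; columns: ColumnInfo[]; }
interface WdcTable { tableInfo: TableInfo; appendRows(rows: unknown[][]): void; }
interface Connector {
  getSchema(done: (tables: TableInfo[]) => void): void;
  getData(table: WdcTable, done: () => void): void;
}

// A connector in the style a tool like Unify might generate: getSchema
// declares the columns of the current OBIEE query, getData streams its rows.
const connector: Connector = {
  getSchema(done) {
    done([{
      id: "obiee_query",
      columns: [
        { id: "product", dataType: "string" },
        { id: "revenue", dataType: "float" },
      ],
    }]);
  },
  getData(table, done) {
    table.appendRows([["P100", 5000], ["P200", 3200]]);
    done();
  },
};

// Mock host standing in for Tableau: ask for the schema, then fetch the
// data, collecting whatever the connector hands back.
function runConnector(c: Connector) {
  let schema: TableInfo[] = [];
  c.getSchema((tables) => { schema = tables; });
  const rows: unknown[][] = [];
  c.getData({ tableInfo: schema[0], appendRows: (r) => { rows.push(...r); } }, () => {});
  return { schema, rows };
}

const result = runConnector(connector);
```

<p>In a real connector, <code>getData</code> would fetch the rows from the backing service (for Unify, the OBIEE query results) before calling <code>appendRows</code>, and the connector object would be registered with the host via <code>tableau.registerConnector</code>.</p>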
Then of course, take advantage of Tableau's powerful and interactive visualisation engine.</p> <p><img src="http://www.rittmanmead.com/blog/content/images/2017/06/4-visual.png" alt="Unify - An Insight Into the Product"></p> <h1 id="unifyserver">Unify Server</h1> <p>Unify comes in <a href="https://unify.ritt.md/unify/desktop">desktop</a> and <a href="https://unify.ritt.md/unify/server">server</a> flavours. The main difference between the two is that the server version allows you to upload Tableau workbooks with OBIEE data to Tableau Server <em>and</em> refresh them. With the desktop version, you will only be able to upload static workbooks that you've created; however, with the server version of Unify, you can tell Tableau Server to refresh data from OBIEE in accordance with a schedule. This lets you produce production-quality dashboards for your users, sourcing data from OBIEE as well as any other source you choose.</p> <h1 id="unifyyourdata">Unify Your Data</h1> <p>In a nutshell, Unify allows you to combine the best aspects of two very powerful BI tools and will prevent the need for building all of your reporting artefacts from scratch if you already have a good, working system.</p> <p>I hope you've found this brief introduction to Unify informative, and if you have OBIEE and would like to try it with Tableau, I encourage you to register for a <a href="https://unify.ritt.md/register">free desktop trial</a>. 
If you have any questions, please don't hesitate to <a href="mailto:unify@rittmanmead.com">get in touch</a>.</p> Minesh Patel 65721995-1ce4-4dfb-ace8-fb1fa6a78851 Thu Jun 15 2017 07:00:00 GMT-0400 (EDT) Giddy Up — Red Pill is Headed to Texas https://medium.com/red-pill-analytics/giddy-up-red-pill-is-headed-to-texas-b28c28198c59?source=rss----abcc62a8d63e---4 <figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*zxwx1rrhRnezrMnRrsukJQ.jpeg" /></figure><h4>Kscope17 — San Antonio, TX</h4><p>The countdown is on for <a href="http://kscope17.com/">ODTUG’s Kscope17 in San Antonio, Texas</a> and there is a packed lineup of impressive content, fun events, and a focus on emerging technologies.</p><p>We know it will be tough to decide which sessions to attend, but make sure you save some time to check out Red Pill Analytics’ sessions, listed below.</p><h3><strong>Analyze This</strong></h3><p>Red Pill Analytics’ reach will also extend outside of the classroom this year. As Kscope17’s Analytics Sponsor, we will be using live polling, IoT technologies, and beacon data to paint a picture of the conference in real time. We will answer questions like: Which sessions are best attended? Which location is the busiest? How many sessions are people attending? Analytics will be on display throughout the conference venue. Visit us near registration and at displays near session rooms in the Grand Oaks Foyer and Wildflower Hallway as we dial you into the fun with live polling. All we’re offering is the truth. Nothing more.</p><p>We will also host a special session about the data gathering process on Tuesday: <a href="http://kscope17.com/component/seminar/seminarslist#A%20Lambda%20Architecture%20in%20the%20Cloud:%20A%20Kscope17%20Case%20Study">A Lambda Architecture in the Cloud: A Kscope17 Case Study</a>. 
We look forward to sharing more with you about the analytics we gather throughout the week and the unique and innovative ways we are using that data to tell a story.</p><h3>Don’t be bound by conference tracks</h3><p>In addition to our sponsorship, Red Pill Analytics has three speakers delivering sessions at Kscope17. Why should you attend a Red Pill Analytics Business Intelligence/Big Data session at Kscope17? Especially if these sessions do not fall in the track you are planning on attending? The ability to communicate with data in a visual way is a skill that is critical in any professional’s toolbelt. Are you interested in learning more about Oracle Data Visualization? Or are you in a pattern of connecting to an Essbase cube, pulling down information in Excel, and mashing different spreadsheets together? Then it is imperative that you attend one of our Data Visualization sessions at <a href="http://kscope17.com/">Kscope17</a> to learn to combine these processes in one place using Oracle Analytics Cloud (OAC).</p><h3>Where Can You Find Us?</h3><p>Check out these Red Pill Analytics sessions at Kscope and swing by our Analytics Stations. <em>(Please note: as with any conference schedule, times may change. Make sure to check out the Kscope17 app for the most up-to-date information.)</em></p><p>Will you be at Kscope17 and want to meet up? <a href="http://redpillanalytics.com/contact/">Contact us</a> and let’s talk analytics.</p><p>We are looking forward to this event and all of the other great opportunities to speak.
Make sure to keep an eye on our <a href="http://events.redpillanalytics.com">events</a> page for what Red Pill Analytics is up to this year.</p><p><strong>Sunday<br>8:30 AM — 4:30 PM</strong> <a href="http://kscope17.com/content/sunday-symposiums#BI">Sunday Symposium</a> <br><strong>8:30–9:00 PM</strong> <a href="http://kscope17.com/events/geek-game-night">Geek Game Night</a></p><p><strong>Monday<br>8:00–10:00 PM: </strong><a href="http://kscope17.com/events/daily-events"><strong>Community Night Event: BI Texas-Style Trivia</strong></a></p><p><strong>Tuesday<br>12:45–1:45 PM: </strong><a href="http://kscope17.com/events/lunch-learn"><strong>Lunch and Learn Panels</strong></a><br><strong>Topics: </strong><br>DATA WAREHOUSING &amp; BIG DATA, Stewart Bryson<br>BI &amp; REPORTING, Michelle Kolbe<br>DATA VISUALIZATION &amp; ADVANCED ANALYTICS, Kevin McGinley</p><h3>Must-See Sessions</h3><blockquote><strong>MONDAY</strong></blockquote><p><a href="http://kscope17.com/component/seminar/seminarslist#Architecture%20Live:%20Designing%20an%20Analytics%20Platform%20for%20the%20Big%20Data%20Era"><strong>Architecture Live: Designing an Analytics Platform for the Big Data Era</strong></a><strong><br></strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=75"><strong>Jean-Pierre Dijcks</strong></a><strong><em>, </em></strong>Oracle Corporation<strong><em><br></em>Co-presenter(s):</strong> <a href="http://kscope17.com/component/seminar/presenterlist?last_id=72"><strong>Stewart Bryson</strong></a><strong>,</strong> Red Pill Analytics<br><strong>When:</strong> June 26 — Monday: Session 1, 10:30-11:30 am<br><strong>Topic:</strong> Data Warehousing &amp; Big Data — <strong>Subtopic:</strong> Data Architecture</p><p>Don’t miss the Architecture Live experience!
In this interactive session, you’ll witness two industry experts digitally illustrating data-driven architectures live, with input and feedback from the audience.</p><p>Kafka, Lambda, and Streaming Analytics will all be covered. We’ll tell you what these words mean and, more importantly, how they affect the choices we make when building an enterprise architecture. With the Oracle Information Management Reference Architecture as the backdrop, we’ll clarify and delineate the different components involved in delivering big data, fast data, and all the gray area in between. The Architecture Live experience will be fun and different, and we’ll all learn something along the way.</p><p><a href="http://kscope17.com/component/seminar/seminarslist#Kafka,%20Data%20Streaming,%20and%20Analytic%20Microservices"><strong>Kafka, Data Streaming and Analytic Microservices</strong></a><strong><br></strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=76"><strong>Stewart Bryson</strong></a><strong><em>, Red Pill Analytics<br></em>When:</strong> June 26 — Monday: Session 2, 11:45 am — 12:45 pm<br><strong>Topic:</strong> Data Warehousing &amp; Big Data — <strong>Subtopic:</strong> Data Architecture</p><p>While traditional data warehouses excel at sourcing data from enterprise applications, they usually fail at handling the volume, velocity, and variety of data for modern analytics applications relying on big and fast data. Instead of modeling these data sources into a system that doesn’t fit, let’s apply a new software design pattern to analytics: microservices.
Microservices are small, independent applications — building blocks that provide only a distinct subset of functionality — that can be stacked together to build an end-to-end platform.</p><p>In this presentation, we’ll explore using Apache Kafka and the Confluent Platform 3.0 as the data streaming hub for ingesting data bound for downstream analytic applications: an enterprise data warehouse, a Hadoop cluster for batch processing, and lightweight, purpose-built microservices in the cloud or on premises. Experience the next generation of analytic platforms.</p><p><a href="http://kscope17.com/component/seminar/seminarslist#Oracle%20Data%20Visualization%20for%20the%20Finance%20Analyst"><strong>Oracle Data Visualization for the Finance Analyst</strong></a><strong><br></strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=66"><strong>Kevin McGinley</strong></a><strong><em>, </em></strong>Red Pill Analytics<strong><em><br></em>Co-presenter(s):</strong> <a href="http://kscope17.com/component/seminar/presenterlist?last_id=170"><strong>Tim German</strong></a>, Qubix<br><strong>When:</strong> June 26 — Monday: Session 3, 2:00–3:00 pm<br><strong>Topic:</strong> Data Visualization &amp; Advanced Analytics — <strong>Subtopic:</strong> Oracle Data Visualization</p><p>Many analysts within Finance are used to manipulating spreadsheets and waiting for enhancements to Essbase cubes to produce reports that need to be shared with their management or peers. This session will demonstrate how all analysts within Finance can get immediate value from Oracle Data Visualization (DV) and decrease their reliance on overly complex spreadsheets.
With its ability to connect to many different kinds of data sources, wrangle them into a usable format, and visualize insights that would otherwise be hard to see in a table, Oracle DV gives analysts an extra layer of functionality they can easily learn and use without IT intervention.</p><p><a href="http://kscope17.com/component/seminar/seminarslist#Using%20R%20for%20Data%20Profiling"><strong>Using R for Data Profiling</strong></a><strong><br></strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=43"><strong>Michelle Kolbe</strong></a><strong><em>, Red Pill Analytics<br></em>When:</strong> June 26 — Monday: Session 3, 2:00-3:00 pm<br><strong>Topic:</strong> BI &amp; Reporting — <strong>Subtopic:</strong> Other BI and Reporting</p><p>The benefits of knowing your data before embarking on a BI project are endless. Sure, you can buy a tool to help with this, or you could use R, an open-source tool. This session will dig into methods for using R to connect to your data source to see visual and tabular analyses of your data set. You’ll learn how to find missing data, outliers, and unexpected values.
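To give a flavour of the kind of checks the session covers (missing data, outliers, unexpected values), here is a minimal sketch — in Python rather than R, purely for illustration; the `ages` column, its values, and the 1.5×IQR outlier rule are assumptions for the example, not material from the session:

```python
import statistics

def profile_column(values):
    """Profile one column: missing count, distinct values, range, IQR outliers."""
    present = [v for v in values if v is not None]
    missing = len(values) - len(present)
    q1, _, q3 = statistics.quantiles(present, n=4)  # quartiles of non-missing values
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr  # classic 1.5*IQR outlier fences
    return {
        "count": len(values),
        "missing": missing,
        "distinct": len(set(present)),
        "min": min(present),
        "max": max(present),
        "outliers": [v for v in present if v < lo or v > hi],
    }

# A made-up "age" column with missing entries and one suspicious value
ages = [34, 29, None, 41, 38, None, 35, 999, 31, 36]
print(profile_column(ages))  # flags 2 missing values and 999 as an outlier
```

The same checks translate directly to R, closer to what the session itself uses, via `summary()`, `is.na()`, and `boxplot.stats()`.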
If you don’t know R, or you want to learn more functions within R, you’ll benefit from this session.</p><blockquote><strong>TUESDAY</strong></blockquote><p><a href="http://kscope17.com/component/seminar/seminarslist#A%20Lambda%20Architecture%20in%20the%20Cloud:%20A%20Kscope17%20Case%20Study"><strong>A Lambda Architecture in the Cloud: A Kscope17 Case Study</strong></a><strong><br></strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=72"><strong>Stewart Bryson</strong></a><strong>,</strong> Red Pill Analytics and <a href="http://kscope17.com/component/seminar/presenterlist?last_id=69"><strong>Kevin McGinley</strong></a><strong>, </strong>Red Pill Analytics<br><strong>When: </strong>Jun 27, 2017, Tuesday: Session 8, 2:00–3:00 pm<strong><br>Topic: </strong>Data Visualization &amp; Advanced Analytics <strong>Subtopic: </strong>Other</p><p>A Lambda Architecture enables data-driven organizations by simultaneously providing batch and speed processing layers to satisfy the overall appetite for analytics and reporting. But building a Lambda architecture is not easy, usually requiring all of the following: a universal ingestion layer, an immutable data store as a system of record, one or more data processing layers that can satisfy batch and speed requirements, and a serving layer that enables data-driven decision making.</p><p>In this session, we’ll demonstrate how Cloud platforms can supercharge the delivery of a capable Lambda architecture.
Our case study will be the IoT data generated by Kscope17 attendees, including beacon data from their badges as well as from other devices capturing the results of live polling.</p><p><a href="http://kscope17.com/component/seminar/seminarslist#Expanding%20Your%20Data-Driven%20Story:%20The%20Next%20Chapter"><strong>Expanding Your Data-Driven Story: The Next Chapter</strong></a><strong><br></strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=70"><strong>Mike Durran</strong></a><strong><em>, Oracle Corporation<br></em>Co-presenter(s):</strong> <a href="http://kscope17.com/component/seminar/presenterlist?last_id=75"><strong>Stewart Bryson</strong></a>, Red Pill Analytics<br><strong>When:</strong> June 27 — Tuesday: Session 9, 3:30-4:30 pm<br><strong>Topic:</strong> Data Visualization &amp; Advanced Analytics — <strong>Subtopic:</strong> Oracle Data Visualization</p><p>Oracle Data Visualization (DV) makes it easy to get insight from your data. This stunningly visual and intuitive product enables you to access, blend, and wrangle a variety of sources — including spreadsheets, databases, and applications — and tell the story of your data. In this session, learn about the power of data storytelling and the latest capabilities of Oracle DV (including details of the product roadmap) to create compelling analytic narratives, including how you can rapidly apply advanced analytic techniques to gain insights previously only accessible to advanced users.
Learn how Oracle DV has been used in real-life scenarios to gain insight and improve business performance.</p><blockquote><strong>WEDNESDAY</strong></blockquote><p><a href="http://kscope17.com/content/thursday-deep-dive-sessions"><strong>Hands-on Training: Oracle DV for the Finance Analyst</strong></a><strong><br></strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=69"><strong>Kevin McGinley</strong></a><strong>, </strong>Red Pill Analytics and<strong> </strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=174"><strong>Tim German</strong></a><strong>, </strong>Qubix<br><strong>When: <br></strong>Wednesday, June 28, 2017, 9:45-11:15 AM <br>Wednesday, June 28, 2017, 1:45-3:15 PM</p><p>This hands-on lab will build on the session given by Kevin McGinley and Tim German, allowing attendees to perform some of the demonstrations shown earlier in the week. Attendees will get to use Oracle Data Visualization against Essbase cubes and Excel spreadsheets, and even learn how to create their own data mashups for their own analytical purposes. They’ll also learn how building certain types of visualizations and using features like narrative mode can help deepen their analysis and make the communication of their findings easier.
Prior attendance of the session is not required to attend the hands-on lab.</p><p><strong>Trends in the World of Analytics, Business Intelligence, and Performance Management Panel Session Moderated by </strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=66"><strong>Edward Roske</strong></a><strong><em>, interRel Consulting<br></em></strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=72"><strong>Stewart Bryson</strong></a><strong>,</strong> Red Pill Analytics<br><strong>When:</strong> Jun 28, 2017, Wednesday Session 14, 1:45 pm — 2:45 pm<br><strong>Topic:</strong> BI &amp; Reporting — <strong>Subtopic:</strong> Other BI and Reporting</p><p>There has never been a panel assembled with as many luminaries in the world of BI, EPM, and business analytics as you’ll see on this stage. Each one of these people has over 20 years of experience, and collectively they’ve been involved in more than 1,000 implementations. But they won’t be talking technical tips: with their wealth of experience, they’ll be discussing trends in the bigger world of analytics. Which products are rising up, where are companies investing their money, what new areas are emerging, and much, much more will be discussed as these gurus descend from their metaphorical mountains to discuss and debate for your amusement and education.
If you want to know what the reporting, analysis, planning, and consolidation fields are up to, come with plenty of questions and an open mind.</p><blockquote><strong>THURSDAY</strong></blockquote><p><a href="http://kscope17.com/content/thursday-deep-dive-sessions"><strong>Deep Dive Session: Navigating the Oracle Business Analytics Frontier</strong></a><strong><br></strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=66"><strong>Kevin McGinley</strong></a><strong>, </strong>Red Pill Analytics<strong><em><br></em>Co-presenter(s):</strong> <a href="http://kscope17.com/component/seminar/presenterlist?last_id=221"><strong>Tracy McMullen</strong></a><strong>,</strong> interRel Consulting<br><strong>When:</strong> June 29 — Deep-Dive Session, 9:00-11:00 am<br><strong>Topic:</strong> BI &amp; Reporting — <strong>Subtopic:</strong> Other BI and Reporting</p><p>Saddle up and dig in your spurs as we trail blaze through Oracle’s Reporting, Business Intelligence, and Data Visualization solutions. Through a rotating panel of experts from Oracle, partners, and customers and interactive discussions with attendees, we’ll navigate Reporting and BI challenges and how Oracle Business Analytics addresses those requirements. Led by moderators Kevin McGinley and Tracy McMullen, the panel will discuss questions such as, “Should I use Smart View or Data Visualization or Oracle Analytics Cloud?”, “How do these solutions work together, and when should I use what?” and, “What are the considerations for moving to the Cloud?” Our panel will share thoughts and perspectives on today’s reporting, BI, and DV questions, climate, and trends. 
We reckon you won’t want to miss this EPM and BI reporting rodeo in Thursday’s Deep Dive Session.</p><p><a href="http://kscope17.com/content/thursday-deep-dive-sessions"><strong>The Great Debate: Where Should My Data Warehouse Live?</strong></a><strong><br></strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=85"><strong>Michael Rainey</strong></a>, Moderator, Gluent<br><strong>When:</strong> June 29 — Deep-Dive Session, 9:00–11:00 am<br><strong>Topic:</strong> Data Warehousing &amp; Big Data — <strong>Subtopic:</strong> Data Architecture<br>Panelists include: <br><a href="http://kscope17.com/component/seminar/presenterlist?last_id=77"><strong>Stewart Bryson</strong></a><strong>,</strong> Red Pill Analytics<br><a href="http://kscope17.com/component/seminar/presenterlist?last_id=81"><strong>Holger Friedrich</strong></a><strong>, </strong>sumIT AG<strong><em><br></em></strong><a href="http://kscope17.com/component/seminar/presenterlist?last_id=75"><strong>Antony Heljula</strong></a><strong>, </strong>Peak Indicators Ltd<br><a href="http://kscope17.com/component/seminar/presenterlist?last_id=105"><strong>Kent Graziano</strong></a><strong>, </strong>Snowflake Computing</p><p>The long-standing debate over running your data warehouse on premises versus in the cloud continues at the Kscope17 Big Data and Data Warehousing Thursday Deep Dive session. Whether built in a traditional, relational database or constructed from “schema on read” data in Hadoop, the rise of cloud services over the past few years has led data architects, IT directors, and CIOs to ask the question: “Where should my data warehouse live?” Several experts in the Oracle data warehousing field will provide arguments for their preferred approach, while attempting to refute evidence presented for the alternative solution.
Along with what should be a lively and engaging debate, the experts will join together to answer any questions you may have around big data, data warehousing, and data integration in general. Don’t miss this great debate and Q&amp;A session!</p><p>Will you be at Kscope17 and want to meet up? <a href="mailto:lauren@redpillanalytics.com">Contact us</a> and let’s get a drink!</p><p>We are looking forward to this event and all of the other great opportunities to speak. Make sure to keep an eye on our <a href="http://events.redpillanalytics.com">events</a> page for what Red Pill Analytics is up to this year.</p><hr><p><a href="https://medium.com/red-pill-analytics/giddy-up-red-pill-is-headed-to-texas-b28c28198c59">Giddy Up — Red Pill is Headed to Texas</a> was originally published in <a href="https://medium.com/red-pill-analytics">Red Pill Analytics</a> on Medium, where people are continuing the conversation by highlighting and responding to this story.</p> Lauren Prezby https://medium.com/p/b28c28198c59 Wed Jun 14 2017 08:56:18 GMT-0400 (EDT) Giddy Up — Red Pill is Headed to Texas http://redpillanalytics.com/kscope17/ <p><img width="300" height="201" src="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/melissa-newkirk-194315.jpg?fit=300%2C201" class="attachment-medium size-medium wp-post-image" alt="Kscope17 Event Details" srcset="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/melissa-newkirk-194315.jpg?w=1920 1920w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/melissa-newkirk-194315.jpg?resize=300%2C201 300w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/melissa-newkirk-194315.jpg?resize=768%2C514 768w, https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/melissa-newkirk-194315.jpg?resize=1024%2C685 1024w" sizes="(max-width: 300px) 100vw, 300px" 
data-attachment-id="4975" data-permalink="http://redpillanalytics.com/kscope17/melissa-newkirk-194315/" data-orig-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/melissa-newkirk-194315.jpg?fit=1920%2C1285" data-orig-size="1920,1285" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="Kscope17 Event Details" data-image-description="&lt;p&gt;Kscope17 Event Details&lt;/p&gt; " data-medium-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/melissa-newkirk-194315.jpg?fit=300%2C201" data-large-file="https://i2.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/melissa-newkirk-194315.jpg?fit=1024%2C685" /></p><p class="graf graf--h3">The countdown is on for <a class="markup--anchor markup--p-anchor" href="http://kscope17.com/" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/">ODTUG’s Kscope17 in San Antonio, Texas</a>, and there is a packed lineup of impressive content, fun events, and a focus on emerging technologies.</p> <p class="graf graf--p">We know it will be tough to decide which sessions to attend, but make sure you save some time to check out Red Pill Analytics’ sessions, listed below.</p> <h2 class="graf graf--h3">Analyze This</h2> <p class="graf graf--p">Red Pill Analytics’ reach will also extend outside of the classroom this year. As Kscope17’s Analytics Sponsor, we will be using live polling, IoT technologies, and beacon data to paint a picture of the conference in real time. We will answer questions like: Which sessions are best attended? Which location is the busiest?
How many sessions are people attending? Analytics will be on display throughout the conference venue. Visit us near registration and at displays near session rooms in the Grand Oaks Foyer and Wildflower Hallway as we dial you into the fun with live polling. All we’re offering is the truth. Nothing more.</p> <p class="graf graf--p">We will also host a special session about the data-gathering process on Tuesday: <a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/seminarslist#A%20Lambda%20Architecture%20in%20the%20Cloud:%20A%20Kscope17%20Case%20Study" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/seminarslist#A%20Lambda%20Architecture%20in%20the%20Cloud:%20A%20Kscope17%20Case%20Study">A Lambda Architecture in the Cloud: A Kscope17 Case Study</a>. We look forward to sharing more with you about the analytics we gather throughout the week and the unique and innovative ways we are using that data to tell a story.</p> <h2 class="graf graf--h3">Don’t be bound by conference tracks</h2> <p class="graf graf--p">In addition to our sponsorship, Red Pill Analytics has three speakers delivering sessions at Kscope17. Why should you attend a Red Pill Analytics Business Intelligence/Big Data session at Kscope17? Especially if these sessions do not fall in the track you are planning on attending? The ability to communicate with data in a visual way is a skill that is critical in any professional’s toolbelt. Are you interested in learning more about Oracle Data Visualization? Or are you in a pattern of connecting to an Essbase cube, pulling down information in Excel, and mashing different spreadsheets together?
Then it is imperative that you attend one of our Data Visualization sessions at <a class="markup--anchor markup--p-anchor" href="http://kscope17.com/" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/">Kscope17</a> to learn to combine these processes in one place using Oracle Analytics Cloud (OAC).</p> <h2 class="graf graf--h3">Where Can You Find Us?</h2> <p class="graf graf--p">Check out these Red Pill Analytics sessions at Kscope and swing by our Analytics Stations. <em class="markup--em markup--p-em">(Please note: as with any conference schedule, times may change. Make sure to check out the Kscope17 app for the most up-to-date information.)</em></p> <p class="graf graf--p">Will you be at Kscope17 and want to meet up? <a class="markup--anchor markup--p-anchor" href="http://redpillanalytics.com/contact/" target="_blank" rel="noopener noreferrer" data-href="http://redpillanalytics.com/contact/">Contact us</a> and let’s talk analytics.</p> <p class="graf graf--p">We are looking forward to this event and all of the other great opportunities to speak.
Make sure to keep an eye on our <a class="markup--anchor markup--p-anchor" href="http://events.redpillanalytics.com" target="_blank" rel="noopener noreferrer" data-href="http://events.redpillanalytics.com">events</a> page for what Red Pill Analytics is up to this year.</p> <p class="graf graf--p"><strong class="markup--strong markup--p-strong">Sunday<br /> 8:30 AM — 4:30 PM</strong> <strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/content/sunday-symposiums#BI" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/content/sunday-symposiums#BI">Sunday Symposium</a></strong><br /> <strong class="markup--strong markup--p-strong">8:30–9:00 PM</strong> <strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/events/geek-game-night" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/events/geek-game-night">Geek Game Night</a></strong></p> <p class="graf graf--p"><strong class="markup--strong markup--p-strong">Monday<br /> 8:00–10:00 PM: </strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/events/daily-events" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/events/daily-events"><strong class="markup--strong markup--p-strong">Community Night Event: BI Texas-Style Trivia</strong></a></p> <p class="graf graf--p"><strong class="markup--strong markup--p-strong">Tuesday<br /> 12:45–1:45 PM: </strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/events/lunch-learn" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/events/lunch-learn"><strong class="markup--strong markup--p-strong">Lunch and Learn Panels</strong></a><br /> <strong class="markup--strong markup--p-strong">Topics: </strong></p> <ul> <li class="graf graf--p">DATA WAREHOUSING &amp; BIG DATA, Stewart Bryson</li> <li class="graf graf--p">BI &amp; REPORTING, Michelle Kolbe</li> <li class="graf graf--p">DATA VISUALIZATION &amp; ADVANCED 
ANALYTICS, Kevin McGinley</li> </ul> <h3 class="graf graf--h3">Must See Sessions</h3> <blockquote class="graf graf--blockquote"><p><strong class="markup--strong markup--blockquote-strong">MONDAY</strong></p></blockquote> <p class="graf graf--p"><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/seminarslist#Architecture%20Live:%20Designing%20an%20Analytics%20Platform%20for%20the%20Big%20Data%20Era" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/seminarslist#Architecture%20Live:%20Designing%20an%20Analytics%20Platform%20for%20the%20Big%20Data%20Era"><strong class="markup--strong markup--p-strong">Architecture Live: Designing an Analytics Platform for the Big Data Era</strong></a><strong class="markup--strong markup--p-strong"><br /> </strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=75" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=75"><strong class="markup--strong markup--p-strong">Jean-Pierre Dijcks</strong></a><strong class="markup--strong markup--p-strong"><em class="markup--em markup--p-em">, </em></strong>Oracle Corporation<strong class="markup--strong markup--p-strong"><em class="markup--em markup--p-em"><br /> </em>Co-presenter(s):</strong> <a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=72" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=72"><strong class="markup--strong markup--p-strong">Stewart Bryson</strong></a><strong class="markup--strong markup--p-strong">,</strong> Red Pill Analytics<br /> <strong class="markup--strong markup--p-strong">When:</strong> June 26 — Monday: Session 1 , 10:30-11:30 am<br /> <strong class="markup--strong markup--p-strong">Topic:</strong> Data Warehousing &amp; Big Data — <strong 
class="markup--strong markup--p-strong">Subtopic:</strong> Data Architecture</p> <p class="graf graf--p">Don’t miss the Architecture Live experience! In this interactive session, you’ll witness two industry experts digitally illustrating data-driven architectures live, with input and feedback from the audience.</p> <p class="graf graf--p">Kafka, Lambda, and Streaming Analytics will all be covered. We’ll tell you what these words mean and, more importantly, how they affect the choices we make when building an enterprise architecture. With the Oracle Information Management Reference Architecture as the backdrop, we’ll clarify and delineate the different components involved in delivering big data, fast data, and all the gray area in between. The Architecture Live experience will be fun and different, and we’ll all learn something along the way.</p> <hr /> <p class="graf graf--p"><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/seminarslist#Kafka,%20Data%20Streaming,%20and%20Analytic%20Microservices" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/seminarslist#Kafka,%20Data%20Streaming,%20and%20Analytic%20Microservices"><strong class="markup--strong markup--p-strong">Kafka, Data Streaming and Analytic Microservices</strong></a><strong class="markup--strong markup--p-strong"><br /> </strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=76" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=76"><strong class="markup--strong markup--p-strong">Stewart Bryson</strong></a><strong class="markup--strong markup--p-strong"><em class="markup--em markup--p-em">, Red Pill Analytics<br /> </em>When:</strong> June 26 — Monday: Session 2, 11:45 am — 12:45 pm<br /> <strong class="markup--strong markup--p-strong">Topic:</strong> Data Warehousing &amp; Big Data — <strong
class="markup--strong markup--p-strong">Subtopic:</strong> Data Architecture</p> <p class="graf graf--p">While traditional data warehouses excel at sourcing data from enterprise applications, they usually fail at handling the volume, velocity, and variety of data for modern analytics applications relying on big and fast data. Instead of modeling these data sources into a system that doesn’t fit, let’s apply a new software design pattern to analytics: microservices. Microservices are small, independent applications — building blocks that provide only a distinct subset of functionality — that can be stacked together to build an end-to-end platform.</p> <p class="graf graf--p">In this presentation, we’ll explore using Apache Kafka and the Confluent Platform 3.0 as the data streaming hub for ingesting data bound for downstream analytic applications: an enterprise data warehouse, a Hadoop cluster for batch processing, and lightweight, purpose-built microservices in the cloud or on-premises. Experience the next generation of analytic platforms.</p> <hr /> <p class="graf graf--p"><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/seminarslist#Oracle%20Data%20Visualization%20for%20the%20Finance%20Analyst" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/seminarslist#Oracle%20Data%20Visualization%20for%20the%20Finance%20Analyst"><strong class="markup--strong markup--p-strong">Oracle Data Visualization for the Finance Analyst</strong></a><strong class="markup--strong markup--p-strong"><br /> </strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=66" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=66"><strong class="markup--strong markup--p-strong">Kevin McGinley</strong></a><strong class="markup--strong markup--p-strong"><em class="markup--em markup--p-em">, 
</em></strong>Red Pill Analytics<strong class="markup--strong markup--p-strong"><em class="markup--em markup--p-em"><br /> </em>Co-presenter(s):</strong> <a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=170" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=170"><strong class="markup--strong markup--p-strong">Tim German</strong></a>, Qubix<br /> <strong class="markup--strong markup--p-strong">When:</strong> June 26 — Monday: Session 3 , 2:00–3:00 pm<br /> <strong class="markup--strong markup--p-strong">Topic:</strong> Data Visualization &amp; Advanced Analytics — <strong class="markup--strong markup--p-strong">Subtopic:</strong> Oracle Data Visualization</p> <p class="graf graf--p">Many analysts within Finance are used to manipulating spreadsheets and waiting for enhancements to Essbase cubes to produce reports that need to be shared with their management or peers. This session will demonstrate how all analysts within Finance can get immediate value from Oracle Data Visualization (DV) and decrease their reliance on overly complex spreadsheets. 
From its ability to connect to many different kinds of data sources, wrangle multiple data sources into a usable format, and visualize insights that would be otherwise hard to see in a table, Oracle DV provides analysts an extra layer of functionality they can easily learn and use without IT intervention.</p> <hr /> <p class="graf graf--p"><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/seminarslist#Using%20R%20for%20Data%20Profiling" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/seminarslist#Using%20R%20for%20Data%20Profiling"><strong class="markup--strong markup--p-strong">Using R for Data Profiling</strong></a><strong class="markup--strong markup--p-strong"><br /> </strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=43" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=43"><strong class="markup--strong markup--p-strong">Michelle Kolbe</strong></a><strong class="markup--strong markup--p-strong"><em class="markup--em markup--p-em">, Red Pill Analytics<br /> </em>When:</strong> June 26 — Monday: Session 3 , 2:00-3:00 pm<br /> <strong class="markup--strong markup--p-strong">Topic:</strong> BI &amp; Reporting — <strong class="markup--strong markup--p-strong">Subtopic:</strong> Other BI and Reporting</p> <p class="graf graf--p">The benefits of knowing your data before embarking on a BI project are endless. Sure, you can buy a tool to help with this, or you could use R, an open-source tool. This session will dig into methods for using R to connect to your data source to see visual and tabular analyses of your data set. You’ll learn how to find missing data, outliers, and unexpected values. 
If you don’t know R or you are wanting to learn more functions within R, you’ll benefit from this session.</p> <blockquote class="graf graf--blockquote"><p><strong class="markup--strong markup--blockquote-strong">TUESDAY</strong></p></blockquote> <p class="graf graf--p"><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/seminarslist#A%20Lambda%20Architecture%20in%20the%20Cloud:%20A%20Kscope17%20Case%20Study" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/seminarslist#A%20Lambda%20Architecture%20in%20the%20Cloud:%20A%20Kscope17%20Case%20Study"><strong class="markup--strong markup--p-strong">A Lambda Architecture in the Cloud: A Kscope17 Case Study</strong></a><strong class="markup--strong markup--p-strong"><br /> </strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=72" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=72"><strong class="markup--strong markup--p-strong">Stewart Bryson</strong></a><strong class="markup--strong markup--p-strong">,</strong> Red Pill Analytics and <a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=69" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=69"><strong class="markup--strong markup--p-strong">Kevin McGinley</strong></a><strong class="markup--strong markup--p-strong">, </strong>Red Pill Analytics<br /> <strong class="markup--strong markup--p-strong">When: </strong>Jun 27, 2017, Tuesday: Session 8, 2:00–3:00 pm<strong class="markup--strong markup--p-strong"><br /> Topic: </strong>Data Visualization &amp; Advanced Analytics <strong class="markup--strong markup--p-strong">Subtopic: </strong>Other</p> <p class="graf graf--p">A Lambda Architecture enables data-driven organizations by simultaneously providing 
batch and speed processing layers to satisfy the overall appetite for analytics and reporting. But building a Lambda architecture is not easy, usually requiring all of the following: a universal ingestion layer, an immutable data store as a system of record, one or more data processing layers that can satisfy batch and speed requirements, and a serving layer that enables data-driven decision making.</p> <p class="graf graf--p">In this session, we’ll demonstrate how Cloud platforms can supercharge the delivery of a capable Lambda architecture. Our case study will be the IoT data generated by Kscope17 attendees including the beacon from their badges, as well as other devices capturing the results of live polling.</p> <hr /> <p class="graf graf--p"><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/seminarslist#Expanding%20Your%20Data-Driven%20Story:%20The%20Next%20Chapter" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/seminarslist#Expanding%20Your%20Data-Driven%20Story:%20The%20Next%20Chapter"><strong class="markup--strong markup--p-strong">Expanding Your Data-Driven Story: The Next Chapter</strong></a><strong class="markup--strong markup--p-strong"><br /> </strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=70" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=70"><strong class="markup--strong markup--p-strong">Mike Durran</strong></a><strong class="markup--strong markup--p-strong"><em class="markup--em markup--p-em">, Oracle Corporation<br /> </em>Co-presenter(s):</strong> <a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=75" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=75"><strong class="markup--strong markup--p-strong">Stewart 
Bryson</strong></a>, Red Pill Analytics<br /> <strong class="markup--strong markup--p-strong">When:</strong> June 27 — Tuesday: Session 9, 3:30-4:30 pm<br /> <strong class="markup--strong markup--p-strong">Topic:</strong> Data Visualization &amp; Advanced Analytics — <strong class="markup--strong markup--p-strong">Subtopic:</strong> Oracle Data Visualization</p> <p class="graf graf--p">Oracle Data Visualization (DV) makes it easy to get insight from your data. This stunningly visual and intuitive product enables you to access, blend, and wrangle a variety of sources — including spreadsheets, databases, and applications — and tell the story of your data. In this session, learn about the power of data storytelling and the latest capabilities of Oracle DV (including details of product roadmap) to create compelling analytic narratives, including how you can rapidly apply advanced analytic techniques to gain insights previously only accessible to advanced users. Learn about how Oracle DV has been used in real-life scenarios to gain insight and improve business performance.</p> <blockquote class="graf graf--blockquote"><p><strong class="markup--strong markup--blockquote-strong">WEDNESDAY</strong></p></blockquote> <p class="graf graf--p"><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/seminarslist#Oracle%20DV%20for%20the%20Finance%20Analyst%20Hands%20on%20Lab" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/seminarslist#Oracle%20DV%20for%20the%20Finance%20Analyst%20Hands%20on%20Lab"><strong class="markup--strong markup--p-strong">Hands-on Training: Oracle DV for the Finance Analyst</strong></a><strong class="markup--strong markup--p-strong"><br /> </strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=69" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=69"><strong 
class="markup--strong markup--p-strong">Kevin McGinley</strong></a><strong class="markup--strong markup--p-strong">, </strong>Red Pill Analytics and <a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=174" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=174"><strong class="markup--strong markup--p-strong">Tim German</strong></a><strong class="markup--strong markup--p-strong">, </strong>Qubix<br /> <strong class="markup--strong markup--p-strong">When:<br /> </strong>Wednesday, June 28, 2017, 9:45-11:15 AM<br /> Wednesday, June 28, 2017, 1:45-3:15 PM</p> <p class="graf graf--p">This hands-on lab will build upon the session given by Kevin McGinley and Tim German, allowing attendees to perform some of the demonstrations shown in that session earlier in the week. Attendees will use Oracle Data Visualization against Essbase cubes and Excel spreadsheets, and will even learn how to create their own data mashups for analytical purposes. They’ll also learn how building certain types of visualizations and using features like narrative mode can help deepen their analysis and make the communication of their findings easier.
Prior attendance of the session is not required to attend the hands-on lab.</p> <hr /> <p class="graf graf--p"><strong class="markup--strong markup--p-strong">Trends in the World of Analytics, Business Intelligence, and Performance Management Panel Session Moderated by </strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=66" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=66"><strong class="markup--strong markup--p-strong">Edward Roske</strong></a><strong class="markup--strong markup--p-strong"><em class="markup--em markup--p-em">, interRel Consulting<br /> </em></strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=72" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=72"><strong class="markup--strong markup--p-strong">Stewart Bryson</strong></a><strong class="markup--strong markup--p-strong">,</strong> Red Pill Analytics<br /> <strong class="markup--strong markup--p-strong">When:</strong> Jun 28, 2017, Wednesday Session 14, 1:45 pm — 2:45 pm<br /> <strong class="markup--strong markup--p-strong">Topic:</strong> BI &amp; Reporting — <strong class="markup--strong markup--p-strong">Subtopic:</strong> Other BI and Reporting</p> <p class="graf graf--p">There has never been a panel assembled with as many luminaries in the world of BI, EPM, and business analytics as you’ll see on this stage. Each one of these people has over 20 years of experience and collectively, they’ve been involved in more than 1,000 implementations. But they won’t be talking technical tips: with their wealth of experience, they’ll be discussing trends in the bigger world of analytics.
Which products are rising up, where are companies investing their money, what new areas are emerging, and much, much more will be discussed as these gurus descend from their metaphorical mountains to discuss and debate for your amusement and education. If you want to know what the reporting, analysis, planning, and consolidation fields are up to, come with plenty of questions and an open mind.</p> <blockquote class="graf graf--blockquote"><p><strong class="markup--strong markup--blockquote-strong">THURSDAY</strong></p></blockquote> <p class="graf graf--p"><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/content/thursday-deep-dive-sessions" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/content/thursday-deep-dive-sessions"><strong class="markup--strong markup--p-strong">Deep Dive Session: Navigating the Oracle Business Analytics Frontier</strong></a><strong class="markup--strong markup--p-strong"><br /> </strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=66" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=66"><strong class="markup--strong markup--p-strong">Kevin McGinley</strong></a><strong class="markup--strong markup--p-strong">, </strong>Red Pill Analytics<strong class="markup--strong markup--p-strong"><em class="markup--em markup--p-em"><br /> </em>Co-presenter(s):</strong> <a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=221" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=221"><strong class="markup--strong markup--p-strong">Tracy McMullen</strong></a><strong class="markup--strong markup--p-strong">,</strong> interRel Consulting<br /> <strong class="markup--strong markup--p-strong">When:</strong> June 29 — Deep-Dive Session, 9:00-11:00 am<br /> <strong 
class="markup--strong markup--p-strong">Topic:</strong> BI &amp; Reporting — <strong class="markup--strong markup--p-strong">Subtopic:</strong> Other BI and Reporting</p> <p class="graf graf--p">Saddle up and dig in your spurs as we trailblaze through Oracle’s Reporting, Business Intelligence, and Data Visualization solutions. Through a rotating panel of experts from Oracle, partners, and customers, and interactive discussions with attendees, we’ll navigate Reporting and BI challenges and how Oracle Business Analytics addresses those requirements. Led by moderators Kevin McGinley and Tracy McMullen, the panel will discuss questions such as, “Should I use Smart View or Data Visualization or Oracle Analytics Cloud?”, “How do these solutions work together, and when should I use what?” and, “What are the considerations for moving to the Cloud?” Our panel will share thoughts and perspectives on today’s reporting, BI, and DV questions, climate, and trends. We reckon you won’t want to miss this EPM and BI reporting rodeo in Thursday’s Deep Dive Session.</p> <p class="graf graf--p"><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/content/thursday-deep-dive-sessions" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/content/thursday-deep-dive-sessions"><strong class="markup--strong markup--p-strong">The Great Debate: Where Should My Data Warehouse Live?</strong></a><strong class="markup--strong markup--p-strong"><br /> </strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=85" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=85"><strong class="markup--strong markup--p-strong">Michael Rainey</strong></a>, Moderator, Gluent<br /> <strong class="markup--strong markup--p-strong">When:</strong> June 29 — Deep-Dive Session, 9:00–11:00 am<br /> <strong class="markup--strong
markup--p-strong">Topic:</strong> Data Warehousing &amp; Big Data — <strong class="markup--strong markup--p-strong">Subtopic:</strong> Data Architecture<br /> Panelists include:<br /> <a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=77" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=77"><strong class="markup--strong markup--p-strong">Stewart Bryson</strong></a><strong class="markup--strong markup--p-strong">,</strong> Red Pill Analytics<br /> <a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=81" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=81"><strong class="markup--strong markup--p-strong">Holger Friedrich</strong></a><strong class="markup--strong markup--p-strong">, </strong>sumIT AG<strong class="markup--strong markup--p-strong"><em class="markup--em markup--p-em"><br /> </em></strong><a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=75" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=75"><strong class="markup--strong markup--p-strong">Antony Heljula</strong></a><strong class="markup--strong markup--p-strong">, </strong>Peak Indicators Ltd<br /> <a class="markup--anchor markup--p-anchor" href="http://kscope17.com/component/seminar/presenterlist?last_id=105" target="_blank" rel="noopener noreferrer" data-href="http://kscope17.com/component/seminar/presenterlist?last_id=105"><strong class="markup--strong markup--p-strong">Kent Graziano</strong></a><strong class="markup--strong markup--p-strong">, </strong>Snowflake Computing</p> <p class="graf graf--p">The long-standing debate over running your data warehouse on premises versus in the cloud continues at the Kscope17 Big Data and Data Warehousing
Thursday Deep Dive session. Whether built in a traditional relational database or constructed from “schema on read” data in Hadoop, the rise of cloud services over the past few years has led data architects, IT directors, and CIOs to ask the question: “Where should my data warehouse live?” Several experts in the Oracle data warehousing field will provide arguments for their preferred approach, while attempting to refute evidence presented for the alternative solution. Along with what should be a lively and engaging debate, the experts will join together to answer any questions you may have around big data, data warehousing, and data integration in general. Don’t miss this great debate and Q&amp;A session!</p> <hr /> <p class="graf graf--p">Will you be at Kscope17 and want to meet up? <a class="markup--anchor markup--p-anchor" href="mailto:lauren@redpillanalytics.com" target="_blank" rel="noopener noreferrer" data-href="mailto:lauren@redpillanalytics.com">Contact us</a> and let’s get a drink!</p> <p class="graf graf--p">We are looking forward to this event and all of the other great opportunities to speak.
Make sure to keep an eye on our <a class="markup--anchor markup--p-anchor" href="http://events.redpillanalytics.com" target="_blank" rel="nofollow noopener noreferrer" data-href="http://events.redpillanalytics.com">events</a> page for what Red Pill Analytics is up to this year.</p> Lauren Prezby http://redpillanalytics.com/?p=4974 Wed Jun 14 2017 08:56:14 GMT-0400 (EDT) Oracle BI Cloud Service (BICS) Access Options: When to Use Data Sync Versus Remote Data Connector (RDC) http://blog.performancearchitects.com/wp/2017/06/14/oracle-bi-cloud-service-bics-access-options-when-to-use-data-sync-versus-remote-data-connector-rdc/ <p>Author: Doug Ross, Performance Architects</p> <p><strong>Introduction</strong></p> <p>As more organizations move their business intelligence (BI) environments to the cloud, loading and accessing enterprise data will become as important as the front-end visualizations.  <a href="https://cloud.oracle.com/business_intelligence">Oracle&#8217;s BI Cloud Service (BICS)</a> offers two options for these data requirements that go beyond simple data uploads: <a href="http://www.oracle.com/technetwork/middleware/bicloud/downloads/index.html">Data Sync and Remote Data Connector</a> (RDC).</p> <p>Other options range from simple manual data loads of spreadsheets using the BICS front end to advanced programmatic options based on <a href="https://en.wikipedia.org/wiki/Representational_state_transfer">REST APIs</a>. Each has a specific purpose, features, benefits, and limitations. 
As the migration to the cloud and the tools used to support that transition are still in their early stages, this blog post discusses the current state of Data Sync and RDC with the expectation that Oracle will continue to enhance the capabilities of each over time.</p> <p><strong>Overview </strong></p> <p>The full list of available BICS data load options includes:</p> <ul> <li><a href="http://www.oracle.com/technetwork/middleware/bicloud/downloads/index.html">Oracle BI Cloud Service Data Sync</a></li> <li><a href="http://www.oracle.com/technetwork/developer-tools/sql-developer/overview/index-097090.html">Oracle SQL Developer</a></li> <li><a href="https://docs.oracle.com/cloud/latest/reportingcs_use/BILPD/GUID-4F98E6CE-F353-4370-9E79-D6DB13225E97.htm#BILPD-GUID-4F98E6CE-F353-4370-9E79-D6DB13225E97">Oracle SQL Workshop Data Upload Utility</a></li> <li><a href="https://docs.oracle.com/en/cloud/paas/exadata-express-cloud/csdbp/using-oracle-application-express-application-data-load-utility.html">Oracle Application Express Application Data Load Utility</a></li> <li>REST APIs</li> <li>PL/SQL scripts</li> </ul> <p>RDC is different from all of the options above in that data is not moved to the cloud; it remains on-premise and is available both to cloud applications like BICS, as well as existing reporting tools in the on-premise environment.</p> <p>At the most basic level, BI Data Sync and RDC represent two ends of the spectrum in providing access to data in the cloud.  
Data Sync is used to push data from on-premise sources to a cloud database, while RDC is used to pull data from an on-premise source database into BICS visualizations.</p> <p>Data Sync provides a full-featured data transfer tool with a client interface that allows for scheduling load jobs that efficiently move data from flat files, database tables, and other cloud data sources into the <a href="https://docs.oracle.com/cloud/latest/dbcs_schema/index.html">BICS Database Schema Service</a> or <a href="https://cloud.oracle.com/database">Oracle Database Cloud Service</a>.  It can also directly load data as a data set source for the Visual Analyzer projects that are available in BICS.  It includes many of the features found in other data loading tools: logging of load job execution steps, restarting after failures, incremental loading of new or modified data, and configuring the sequence of load operations.</p> <p>Rather than moving data to the cloud, RDC enables secure connection to on-premise data sources for analysis and visualization.  BICS RDC utilizes the BI Server Data Gateway running in the BI Cloud Service environment to provide secure access to on-premises data using private/public key pairs and SSL communication.  The primary benefit of RDC is that it preserves the investment in the technology used to house and load on-premise data warehouses.  
It offers a hybrid approach to transitioning to a cloud-based analytics environment without having to migrate the entire data environment as well.</p> <p>The decision between Data Sync and RDC would be based on a number of factors:</p> <ul> <li>Concerns over data security in the cloud</li> <li>Data volumes in the local data warehouse tables that might be difficult to transfer to the cloud in a timely manner</li> <li>Synchronization of data transmissions to the cloud with current load processes</li> <li>Ongoing investment in ETL tools, processes, and employees that would not be ready to transition completely to the cloud</li> </ul> <p>A Data Sync solution would more likely lend itself either to new development of data load processes or to a more agile analytics environment that allows for changing processes and data models more rapidly than would be possible with an on-premise database.</p> <p><strong>Conclusion</strong></p> <p>Regardless of where a BI environment is located, it truly is all about the data. And with the push to migrate more of the analytics functions into the cloud, it is necessary to determine the optimal strategy for using data visualization tools to access that data.  Oracle provides many options to do this, whether it is the relatively simple process of configuring access to existing on-premise databases by using RDC or implementing a fully formed data loading strategy into the cloud using BI Data Sync.
The capabilities and tradeoffs for each method should be reviewed thoroughly before proceeding with a cloud-based BI solution.</p> Melanie Mathews http://blog.performancearchitects.com/wp/?p=2029 Wed Jun 14 2017 05:48:40 GMT-0400 (EDT) Big Data Tundra: Creating a Flexible Cloud Based Data Ecosystem http://redpillanalytics.com/flexiblecloudbaseddataecosystem/ <p><img width="300" height="200" src="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/yousif-malibiran-125106.jpg?fit=300%2C200" class="attachment-medium size-medium wp-post-image" alt="Cloud Presentation" srcset="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/yousif-malibiran-125106.jpg?w=1920 1920w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/yousif-malibiran-125106.jpg?resize=300%2C200 300w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/yousif-malibiran-125106.jpg?resize=768%2C512 768w, https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/yousif-malibiran-125106.jpg?resize=1024%2C683 1024w" sizes="(max-width: 300px) 100vw, 300px" data-attachment-id="4967" data-permalink="http://redpillanalytics.com/flexiblecloudbaseddataecosystem/yousif-malibiran-125106/" data-orig-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/yousif-malibiran-125106.jpg?fit=1920%2C1281" data-orig-size="1920,1281" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="Cloud Presentation" data-image-description="&lt;p&gt;Cloud Presentation&lt;/p&gt; " 
data-medium-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/yousif-malibiran-125106.jpg?fit=300%2C200" data-large-file="https://i1.wp.com/redpillanalytics.com/wp-content/uploads/2017/06/yousif-malibiran-125106.jpg?fit=1024%2C683" /></p><p><span id="ember8242" class="ember-view">Did you miss <a href="https://www.linkedin.com/in/phil-goerdt-773a2319/">Phil Goerdt</a> and </span><a id="ember8245" class="feed-link feed-s-main-content__mention ember-view" tabindex="0" href="https://www.linkedin.com/in/mike-fuller-935823b4/" data-control-name="mention">Mike Fuller</a>&#8216;s <span id="ember8247" class="ember-view">presentation about Cloud computing and storing data at </span><a id="ember8250" class="feed-link feed-s-main-content__mention ember-view" tabindex="0" href="https://www.linkedin.com/company-beta/2625621/" data-control-name="mention">MinneAnalytics</a><span id="ember8252" class="ember-view"> <a id="ember8255" class="hashtag-link ember-view" href="https://www.linkedin.com/search/results/content/?keywords=%23BIgDataTech&amp;origin=HASH_TAG_FROM_FEED" data-control-name="update_hashtag">#BIgDataTech</a> last week? Let us share their presentation with you! Download it <a href="https://www.slideshare.net/PhilGoerdt/big-data-tundra-creating-a-flexible-cloud-based-data-ecosystem?trk=v-feed">here</a>!</span></p> <p>Cloud computing has changed how organizations use, access and store their data. While these paradigms have shifted, the traditional way of thinking about databases and data warehouses remain steadfast in “on-prem” thinking, even in many cloud deployments. Can cloud-native data platforms such as Snowflake coupled with Big Data thinking enable better performance, lower total cost of ownership, and higher data flexibility? 
This presentation will walk you through a real-world customer story to provide the answer.</p> Lauren Prezby http://redpillanalytics.com/?p=4966 Mon Jun 12 2017 11:38:31 GMT-0400 (EDT) Unify: See Your Data From Every Perspective http://www.rittmanmead.com/blog/2017/06/unify-see-your-data-from-every-perspective/ <img src="http://www.rittmanmead.com/blog/content/images/2017/06/explainexplore3-1.jpg" alt=&qu