Saiku manages the data sources of connected systems by adding schemas and datasources, and provides a UI as an intuitive way to analyze data. The UI generates MDX; Mondrian connects to the data source, parses the MDX, and executes the query.
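For illustration, here is the kind of MDX statement Saiku's UI generates behind the scenes, written against the bundled foodmart sample (the cube and member names are just an example):

```mdx
SELECT {[Measures].[Unit Sales]} ON COLUMNS,
       {[Time].[1997].Children} ON ROWS
FROM [Sales]
```

Mondrian translates a statement like this into SQL against the underlying JDBC data source.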

Kylin provides OLAP capability over large-scale data. By wiring Saiku up to Kylin, we can use Saiku's friendly UI to query Kylin conveniently.

Regarding the Saiku-Kylin integration, the GitHub project https://github.com/mustangore/kylin-mondrian-interaction patches Mondrian with a KylinDialect to support Kylin as a data source. Youzan has also published an integration of the Kylin, Mondrian, and Saiku systems. With the Mondrian patch above applied and Kylin's JDBC jar added, you can write a schema in Saiku that defines a cube and query it.

The integration above still requires manually configuring the data source and hand-writing the schema, which feels tedious. Instead, we can modify Saiku's code to fetch project and cube metadata from Kylin, convert it into a schema according to fixed rules, and register the result as a managed data source. That connects Saiku to Kylin truly seamlessly.

Without further ado, the code:

Saiku defines a RepositoryDatasourceManager in saiku-beans.xml to manage the datasources added to the system; by default it loads the foodmart and earthquake sample sources:

```xml
<bean id="repositoryDsManager" class="org.saiku.service.datasource.RepositoryDatasourceManager"
      init-method="load" destroy-method="unload">
    <property name="userService" ref="userServiceBean"/>
    <property name="configurationpath" value="${repoconfig}"/>
    <property name="datadir" value="${repodata}"/>
    <property name="foodmartdir" value="${foodmartrepo}"/>
    <property name="foodmartschema" value="${foodmartschema}"/>
    <property name="foodmarturl" value="${foodmarturl}"/>
    <property name="earthquakeDir" value="${earthquakerepo}"/>
    <property name="earthquakeSchema" value="${earthquakeschema}"/>
    <property name="earthquakeUrl" value="${earthquakeurl}"/>
    <property name="repoPasswordProvider" ref="repoPasswordProviderBean"/>
    <property name="defaultRole" value="${default.role}"/>
    <property name="externalPropertiesFile" value="${external.properties.file}"/>
    <!-- If you change the repoPassword, set this property for at least one restart
         to update the old repo password -->
    <!--<property name="oldRepoPassword" value="sa!kuanalyt!cs"/>-->
</bean>
```

Since we want to load Kylin cubes as data sources automatically, we need to define a new datasource manager that loads and manages those Kylin data sources for us.

Define `public class KylinResourceDatasourceManager implements IDatasourceManager`. Saiku's datasource manager records the information of every added data source in a virtual file system, so our datasource manager starts the virtual file system the same way. On startup, to avoid handling problems such as cube updates, we simply re-initialize everything each time:

```java
// On startup, delete all schemas and sources and re-initialize;
// for the deletion code, see RepositoryDatasourceManager
deleteAllSchema();
deleteAllSource();
```

Next, load the cubes from Kylin:

```java
List<ProjectInstance> projects = null;
try {
    // Call Kylin's REST API to fetch all projects
    projects = restService.httpGet("projects", new TypeReference<List<ProjectInstance>>() {}, null);
} catch (Exception e) {
    e.printStackTrace();
}
if (projects != null) {
    // Iterate over the projects and fetch every cube
    for (ProjectInstance project : projects) {
        List<CubeInstance> cubes = getCubes(project.getName());
        for (CubeInstance cubeInstance : cubes) {
            String newCubeName = project.getName() + "#" + cubeInstance.getName();
            // Store the cube information in the datasources map
            datasources.put(newCubeName, getSaikuDatasource(newCubeName));
        }
    }
}
```
```java
private SaikuDatasource getSaikuDatasource(String datasourceName) {
    if (datasourceName.contains("#")) {
        String cubeName = datasourceName.split("#")[1].trim();
        CubeDesc[] cubeDescs = null;
        try {
            // Fetch the cube's detailed description
            cubeDescs = restService.httpGet("cube_desc/" + cubeName,
                    new TypeReference<CubeDesc[]>() {}, null);
        } catch (Exception e) {
            e.printStackTrace();
        }
        List<DataModelDesc> modelDescs = null;
        try {
            // Fetch the model the cube belongs to
            modelDescs = restService.httpGet("models",
                    new TypeReference<List<DataModelDesc>>() {},
                    new BasicNameValuePair("modelName", cubeDescs[0].getModelName()),
                    new BasicNameValuePair("projectName", datasourceName.split("#")[0].trim()));
        } catch (Exception e) {
            e.printStackTrace();
        }
        if (cubeDescs != null && cubeDescs.length == 1 && modelDescs != null) {
            try {
                // Generate the schema and add it to the virtual file system
                addSchema(SchemaUtil.createSchema(datasourceName, cubeDescs[0], modelDescs.get(0)),
                        "/datasources/" + datasourceName.replace("#", ".") + ".xml", datasourceName);
            } catch (Exception e) {
                e.printStackTrace();
            }
            String project;
            if (datasourceName.contains("#")) {
                project = datasourceName.split("#")[0].trim();
            } else {
                project = datasourceName;
            }
            Properties properties = new Properties();
            // Datasource information handed on to Mondrian
            properties.put("location", "jdbc:mondrian:Jdbc=jdbc:kylin://" + kylinUrl + "/" + project
                    + ";JdbcDrivers=" + KYLINE_DRIVER
                    + ";Catalog=mondrian:///datasources/" + datasourceName.replace("#", ".") + ".xml");
            properties.put("driver", "mondrian.olap4j.MondrianOlap4jDriver");
            properties.put("username", userName);
            properties.put("password", passWord);
            properties.put("security.enabled", false);
            properties.put("advanced", false);
            return new SaikuDatasource(cubeName, SaikuDatasource.Type.OLAP, properties);
        }
    }
    return null;
}
```
```java
public class SchemaUtil {

    private static String newLine = "\r\n";

    public static String createSchema(String dataSourceName, CubeDesc cubeDesc, DataModelDesc modelDesc) {
        StringBuffer sb = new StringBuffer();
        sb = appendSchema(sb, dataSourceName, cubeDesc, modelDesc);
        return sb.toString();
    }

    public static StringBuffer appendSchema(StringBuffer sb, String dataSourceName, CubeDesc cubeDesc, DataModelDesc modelDesc) {
        sb.append("<?xml version='1.0'?>").append(newLine)
          .append("<Schema name='" + dataSourceName.split("#")[0].trim() + "' metamodelVersion='4.0'>")
          .append(newLine);
        sb = appendTable(sb, cubeDesc.getDimensions());
        sb = appendDimension(sb, cubeDesc.getDimensions(), modelDesc);
        sb = appendCube(sb, dataSourceName, cubeDesc, modelDesc);
        sb.append("</Schema>").append(newLine);
        return sb;
    }

    public static StringBuffer appendTable(StringBuffer sb, List<DimensionDesc> dimensionDescList) {
        Set<String> tables = getTables(dimensionDescList);
        sb.append("<PhysicalSchema>").append(newLine);
        for (String key : tables) {
            sb.append("<Table name='" + key + "'/>").append(newLine);
        }
        sb.append("</PhysicalSchema>").append(newLine);
        return sb;
    }

    public static Map<String, JoinDesc> getJoinDesc(DataModelDesc modelDesc) {
        Map<String, JoinDesc> joinDescMap = new HashMap<String, JoinDesc>();
        for (LookupDesc lookupDesc : modelDesc.getLookups()) {
            if (!joinDescMap.containsKey(dealTableName(lookupDesc.getTable()))) {
                joinDescMap.put(dealTableName(lookupDesc.getTable()), lookupDesc.getJoin());
            }
        }
        return joinDescMap;
    }

    public static StringBuffer appendDimension(StringBuffer sb, List<DimensionDesc> dimensionDescList, DataModelDesc modelDesc) {
        StringBuffer hierSb = new StringBuffer();
        for (DimensionDesc dimensionDesc : dimensionDescList) {
            String table = dealTableName(dimensionDesc.getTable());
            Map<String, JoinDesc> joinDescMap = getJoinDesc(modelDesc);
            Set<String> columns = getColumns(dimensionDesc);
            if (joinDescMap.containsKey(table)) {
                sb.append("<Dimension name='" + dimensionDesc.getName() + "' key='"
                        + joinDescMap.get(table).getPrimaryKey()[0] + "' table='" + table + "'>").append(newLine);
            } else {
                sb.append("<Dimension name='" + dimensionDesc.getName() + "' key='"
                        + columns.iterator().next() + "' table='" + table + "'>").append(newLine);
            }
            hierSb.append("<Hierarchies>").append(newLine);
            sb.append("<Attributes>").append(newLine);
            for (String column : columns) {
                // Add the attribute to the buffer
                sb = addAttribute(sb, column);
                if (joinDescMap.containsKey(table)) {
                    int index = getForeignKeyIndex(column, joinDescMap.get(table));
                    if (index != -1) {
                        hierSb = addJoinHierarchy(hierSb, index, table, joinDescMap.get(table),
                                dealTableName(modelDesc.getFactTable()));
                    } else {
                        // Add the hierarchy to the buffer
                        hierSb = addHierarchy(hierSb, column);
                    }
                } else {
                    // Add the hierarchy to the buffer
                    hierSb = addHierarchy(hierSb, column);
                }
            }
            if (joinDescMap.containsKey(table)) {
                for (String primaryKey : joinDescMap.get(table).getPrimaryKey()) {
                    if (!columns.contains(primaryKey)) {
                        sb = addAttribute(sb, primaryKey);
                    }
                }
            }
            sb.append("</Attributes>").append(newLine);
            hierSb.append("</Hierarchies>").append(newLine);
            sb.append(hierSb);
            hierSb.delete(0, hierSb.length());
            sb.append("</Dimension>").append(newLine);
        }
        return sb;
    }

    public static Set<String> getColumns(DimensionDesc dimensionDesc) {
        Set<String> columns = new HashSet<String>();
        if (dimensionDesc.getColumn() != null || dimensionDesc.getDerived() != null) {
            if (dimensionDesc.getColumn() != null) {
                columns.add(dimensionDesc.getColumn());
            }
            if (dimensionDesc.getDerived() != null) {
                for (String derived : dimensionDesc.getDerived()) {
                    columns.add(derived);
                }
            }
        } else {
            columns.add(dimensionDesc.getName());
        }
        return columns;
    }

    public static StringBuffer addAttribute(StringBuffer sb, String attr) {
        sb.append("<Attribute hasHierarchy='false' levelType='Regular' name='" + attr + "'>").append(newLine)
          .append("<Key>").append(newLine)
          .append("<Column name='" + attr + "'/>").append(newLine)
          .append("</Key>").append(newLine)
          .append("</Attribute>").append(newLine);
        return sb;
    }

    public static StringBuffer addHierarchy(StringBuffer sb, String attr) {
        sb.append("<Hierarchy name='" + attr + "' hasAll='true'>").append(newLine)
          .append("<Level attribute='" + attr + "'/>").append(newLine)
          .append("</Hierarchy>").append(newLine);
        return sb;
    }

    public static StringBuffer addJoinHierarchy(StringBuffer sb, int index, String table, JoinDesc joinDesc, String factTable) {
        sb.append("<Hierarchy hasAll='true' name='" + joinDesc.getPrimaryKey()[index] + "'>").append(newLine)
          .append("<Join leftKey='" + joinDesc.getPrimaryKey()[index]
                  + "' rightKey='" + joinDesc.getForeignKey()[index] + "'>").append(newLine)
          .append("<Table name='" + factTable + "'/>").append(newLine)
          .append("<Table name='" + table + "'/>").append(newLine)
          .append("<RelationOrJoin type='" + joinDesc.getType() + "' />").append(newLine)
          .append("</Join>").append(newLine)
          .append("<Level attribute='" + joinDesc.getForeignKey()[index] + "'/>").append(newLine)
          .append("</Hierarchy>").append(newLine);
        return sb;
    }

    public static int getForeignKeyIndex(String attr, JoinDesc joinDesc) {
        for (int i = 0; i < joinDesc.getPrimaryKey().length; i++) {
            if (joinDesc.getPrimaryKey()[i].equals(attr)) {
                return i;
            }
        }
        return -1;
    }

    public static String dealTableName(String tableName) {
        if (tableName.contains(".")) {
            return tableName.split("\\.")[1];
        } else {
            return tableName;
        }
    }

    public static StringBuffer appendCube(StringBuffer sb, String cubeName, CubeDesc cubeDesc, DataModelDesc modelDesc) {
        sb.append("<Cube name='" + cubeName.split("#")[1].trim() + "'>").append(newLine);
        sb = addCubeDimension(sb, cubeDesc.getDimensions());
        sb.append("<MeasureGroups>").append(newLine);
        sb.append("<MeasureGroup table='" + dealTableName(modelDesc.getFactTable()) + "'>").append(newLine);
        sb = addDimensionLink(sb, cubeDesc.getDimensions(), modelDesc);
        sb.append("<Measures>").append(newLine);
        for (MeasureDesc measureDesc : cubeDesc.getMeasures()) {
            sb = addMeasure(sb, measureDesc, getColumn(cubeDesc));
        }
        sb.append("</Measures>").append(newLine);
        sb.append("</MeasureGroup>").append(newLine);
        sb.append("</MeasureGroups>").append(newLine);
        sb.append("</Cube>").append(newLine);
        return sb;
    }

    public static StringBuffer addCubeDimension(StringBuffer sb, List<DimensionDesc> dimensionDescs) {
        sb.append("<Dimensions>").append(newLine);
        for (DimensionDesc dimensionDesc : dimensionDescs) {
            sb.append("<Dimension source='" + dimensionDesc.getName() + "' visible='true'/>").append(newLine);
        }
        sb.append("</Dimensions>").append(newLine);
        return sb;
    }

    public static StringBuffer addDimensionLink(StringBuffer sb, List<DimensionDesc> dimensionDescs, DataModelDesc modelDesc) {
        sb.append("<DimensionLinks>").append(newLine);
        for (DimensionDesc dimensionDesc : dimensionDescs) {
            if (dimensionDesc.getTable().contains(modelDesc.getFactTable())) {
                sb.append("<FactLink dimension='" + dimensionDesc.getName() + "'/>").append(newLine);
            } else {
                for (LookupDesc lookupDesc : modelDesc.getLookups()) {
                    if (dimensionDesc.getTable().contains(lookupDesc.getTable())) {
                        for (String primaryKey : lookupDesc.getJoin().getPrimaryKey()) {
                            sb.append("<ForeignKeyLink dimension='" + dimensionDesc.getName()
                                    + "' foreignKeyColumn='" + primaryKey + "'/>").append(newLine);
                        }
                    }
                }
            }
        }
        sb.append("</DimensionLinks>").append(newLine);
        return sb;
    }

    public static StringBuffer addMeasure(StringBuffer sb, MeasureDesc measureDesc, String defaultColumn) {
        FunctionDesc functionDesc = measureDesc.getFunction();
        String aggregator = functionDesc.getExpression().trim().toLowerCase();
        // Mondrian only has distinct-count
        if (aggregator.equals("count_distinct")) {
            aggregator = "distinct-count";
        }
        if (functionDesc.getParameter().getValue().equals("1")) {
            sb.append("<Measure aggregator='" + aggregator + "' column='" + defaultColumn
                    + "' name='" + measureDesc.getName() + "' visible='true'/>").append(newLine);
        } else {
            sb.append("<Measure aggregator='" + aggregator + "' column='" + functionDesc.getParameter().getValue()
                    + "' name='" + measureDesc.getName() + "' visible='true'/>").append(newLine);
        }
        return sb;
    }

    public static Set<String> getTables(List<DimensionDesc> dimensionDescList) {
        Set<String> tables = new HashSet<String>();
        for (DimensionDesc dimensionDesc : dimensionDescList) {
            tables.add(dealTableName(dimensionDesc.getTable()));
        }
        return tables;
    }

    public static String getColumn(CubeDesc cubeDesc) {
        RowKeyDesc rowKey = cubeDesc.getRowkey();
        return rowKey.getRowKeyColumns()[0].getColumn();
    }
}
```
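To make the generator's output concrete, here is a sketch of the kind of Mondrian 4 schema SchemaUtil would emit for a hypothetical project with one dimension and one measure (the project, cube, table, and column names here are invented for illustration):

```xml
<?xml version='1.0'?>
<Schema name='learn_kylin' metamodelVersion='4.0'>
  <PhysicalSchema>
    <Table name='KYLIN_SALES'/>
  </PhysicalSchema>
  <Dimension name='PART_DT' key='PART_DT' table='KYLIN_SALES'>
    <Attributes>
      <Attribute hasHierarchy='false' levelType='Regular' name='PART_DT'>
        <Key>
          <Column name='PART_DT'/>
        </Key>
      </Attribute>
    </Attributes>
    <Hierarchies>
      <Hierarchy name='PART_DT' hasAll='true'>
        <Level attribute='PART_DT'/>
      </Hierarchy>
    </Hierarchies>
  </Dimension>
  <Cube name='sales_cube'>
    <Dimensions>
      <Dimension source='PART_DT' visible='true'/>
    </Dimensions>
    <MeasureGroups>
      <MeasureGroup table='KYLIN_SALES'>
        <DimensionLinks>
          <FactLink dimension='PART_DT'/>
        </DimensionLinks>
        <Measures>
          <Measure aggregator='sum' column='PRICE' name='TOTAL_PRICE' visible='true'/>
        </Measures>
      </MeasureGroup>
    </MeasureGroups>
  </Cube>
</Schema>
```

The element order mirrors appendSchema: physical tables first, then shared dimensions, then the cube with its measure group.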

Then, during initialization, start a separate thread that periodically checks whether new cubes have been added and, if so, loads them:

```java
if (projects != null) {
    for (ProjectInstance project : projects) {
        List<CubeInstance> cubes = getCubes(project.getName());
        for (CubeInstance cubeInstance : cubes) {
            String newCubeName = project.getName() + "#" + cubeInstance.getName();
            if (!datasources.containsKey(newCubeName)) {
                datasources.put(newCubeName, getSaikuDatasource(newCubeName));
            }
        }
    }
}
```
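The periodic check can be wired up with a ScheduledExecutorService. A minimal sketch, decoupled from Saiku's classes: `CubeRefresher`, `loadDatasource`, and the `fetchCubeNames` supplier are hypothetical stand-ins for the Kylin REST calls and `getSaikuDatasource` above, and the refresh interval is arbitrary:

```java
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.function.Supplier;

public class CubeRefresher {

    private final Map<String, Object> datasources = new ConcurrentHashMap<>();
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

    // Merge any cube names we have not seen yet; returns the names that were added.
    public List<String> refresh(Collection<String> currentCubeNames) {
        List<String> added = new ArrayList<>();
        for (String name : currentCubeNames) {
            if (!datasources.containsKey(name)) {
                // Stand-in for getSaikuDatasource(name) from the code above
                datasources.put(name, loadDatasource(name));
                added.add(name);
            }
        }
        return added;
    }

    // Hypothetical stand-in for building a SaikuDatasource from Kylin metadata.
    protected Object loadDatasource(String cubeName) {
        return cubeName;
    }

    // Poll with a fixed delay; fetchCubeNames would call Kylin's REST API.
    public void start(Supplier<Collection<String>> fetchCubeNames, long periodSeconds) {
        scheduler.scheduleWithFixedDelay(() -> refresh(fetchCubeNames.get()),
                periodSeconds, periodSeconds, TimeUnit.SECONDS);
    }

    public void stop() {
        scheduler.shutdownNow();
    }

    public int size() {
        return datasources.size();
    }
}
```

`scheduleWithFixedDelay` (rather than `scheduleAtFixedRate`) keeps a slow Kylin round-trip from piling up overlapping refreshes.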

With that, you can happily query Kylin cubes through Saiku.

Related projects:

saiku 3.8.8  https://github.com/OSBI/saiku

mondrian 4.4 https://github.com/pentaho/mondrian

kylin 1.5.3 https://github.com/apache/kylin
