A Worked Example of Efficiently Parsing JSON Data in SQL Server

2022-08-29 13:52 · 飞虹147 · SQL Server

SQL Server has supported a set of JSON operations since SQL Server 2016, and in a recent project many columns were stored directly as JSON strings. This article walks through an approach to parsing JSON-formatted data in SQL Server; readers who need this may find it a useful reference.

1. Background

I recently hit a requirement where the source data lived in PostgreSQL and was stored as JSON. That format cannot be used directly in SQL Server, so it first has to be parsed into a structured row/column layout before it can be reused.

Sample data:

'[{"key":"2019-01-01","value":"4500.0"},{"key":"2019-01-02","value":"4500.0"},{"key":"2019-01-03","value":"4500.0"},{"key":"2019-01-04","value":"4500.0"},{"key":"2019-01-05","value":"4500.0"},{"key":"2019-01-06","value":"4500.0"},{"key":"2019-01-07","value":"4500.0"},{"key":"2019-01-08","value":"4500.0"},{"key":"2019-01-09","value":"4500.0"},{"key":"2019-01-10","value":"4500.0"},{"key":"2019-01-11","value":"4500.0"},{"key":"2019-01-12","value":"4500.0"},{"key":"2019-01-13","value":"4500.0"},{"key":"2019-01-14","value":"4500.0"},{"key":"2019-01-15","value":"4500.0"},{"key":"2019-01-16","value":"4500.0"},{"key":"2019-01-17","value":"4500.0"},{"key":"2019-01-18","value":"4500.0"},{"key":"2019-01-19","value":"4500.0"},{"key":"2019-01-20","value":"4500.0"},{"key":"2019-01-21","value":"4500.0"},{"key":"2019-01-22","value":"4500.0"},{"key":"2019-01-23","value":"4500.0"},{"key":"2019-01-24","value":"4500.0"},{"key":"2019-01-25","value":"4500.0"},{"key":"2019-01-26","value":"4500.0"},{"key":"2019-01-27","value":"4500.0"},{"key":"2019-01-28","value":"4500.0"},{"key":"2019-01-29","value":"4500.0"},{"key":"2019-01-30","value":"4500.0"},{"key":"2019-01-31","value":"4500.0"}]'

After some research, the approach is: first split the JSON string into individual key-value pairs, then slice each key-value substring to obtain the two column values.

2. Splitting: break the JSON string into key-value substrings

This step combines row numbers with the separator to do the split; see the sample below.
The idea is to use a sequence of consecutive integers (starting at 1) as character positions: at each position of the source string, take a substring whose length equals the separator's length (here 1), and keep the positions where that substring equals the separator as valid records to process. It is a bit like shotgun sequencing in DNA testing: cast a wide net first, then identify and track by markers.
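The position-scan described above can be sketched outside SQL as well. The following Python sketch mirrors the CTE logic for a single-character separator (the function name `split_by_positions` is illustrative, not from the article): prefix the string with the separator, keep each 1-based position whose preceding character is the separator, and cut from there to the next separator.

```python
def split_by_positions(s: str, sep: str = ",") -> list[str]:
    """Split s on sep the way the number-table CTE does it."""
    prefixed = sep + s   # prepend sep so position 1 qualifies as an element start
    padded = s + sep     # append sep so the last element has a terminator
    elements = []
    for n in range(1, len(s) + 1):       # 1-based positions, like the CTE index
        if prefixed[n - 1] == sep:       # the char *before* position n is a separator
            end = padded.index(sep, n - 1)   # next separator at or after position n
            elements.append(s[n - 1:end])
    return elements

print(split_by_positions("jun,cong,haha"))  # → ['jun', 'cong', 'haha']
```

Each qualifying position corresponds to one row produced by the join in the SQL below; `padded.index(...)` plays the role of `charindex(@sep, s+@sep, n)`.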

/*
 * Date   : 2020-07-01
 * Author : 飞虹
 * Sample : split a delimiter-separated string into one column of values
 * Input  : the string 'jun,cong,haha'
 * Output : one column with the values 'jun', 'cong', 'haha'
 */
declare @s   nvarchar(500) = 'jun,cong,haha'
       ,@sep nvarchar(5)   = ',';

with cte_Num as (
    -- number sequence 1..100, used as character positions
    select 1 as n
    union all
    select n+1 as n from cte_Num where n < 100
)
select d.s
      ,a.n
      ,n - len(replace(left(s, n), @sep, '')) + 1        as pos       -- ordinal of the element
      ,charindex(@sep, s + @sep, n)                      as next_sep  -- position of the next separator
      ,substring(s, n, charindex(@sep, s + @sep, n) - n) as element
from (select @s as s) as d
join cte_Num a
  on n <= len(s)
 and substring(@sep + s, n, 1) = @sep   -- position n starts a new element

3. Extraction: create a function to pull the values out of a key-value substring

Step 2 splits the long JSON string into key-value substrings such as "2020-01-01":"98.99". From there it is straightforward: you can either write a table-valued function to return the result or slice the values out directly with substring. Here the logic is wrapped in a table-valued function.

/*
 *******************************************************************************
 *     Date : 2020-07-01
 *   Author : 飞虹
 *     Note : use patindex to match characters, and a while loop to strip the
 *            unwanted ones one by one.
 * Function : getDateAmt
 *    Input : a key-value string such as "2020-01-01":"98.99"
 *   Output : a table (date column, numeric column), e.g. 2020-01-01, 98.99
 *******************************************************************************
 */
CREATE FUNCTION dbo.getDateAmt(@S VARCHAR(100))
RETURNS @tb_rs table(dt date, amt decimal(28,14))
AS
BEGIN
    -- strip every character that is not a digit, comma, hyphen or dot
    WHILE PATINDEX('%[^0-9,-.]%', @S) > 0
    BEGIN
        SET @S = STUFF(@S, PATINDEX('%[^0-9,-.]%', @S), 1, '')
    END
    INSERT INTO @tb_rs
    SELECT SUBSTRING(@S, 1, CHARINDEX(',', @S) - 1)
         , SUBSTRING(@S, CHARINDEX(',', @S) + 1, LEN(@S))
    RETURN
END
GO

-- test
SELECT * FROM dbo.getDateAmt('{"key":"2019-01-01","value":"4500.0"')

4. Complete example

Below is the complete script, done entirely with CTEs, so you can run the query directly and preview the result.

;with cte_t1 as (
            select * from
            ( values('jun','[{"key":"2019-01-01","value":"4500.0"},{"key":"2019-01-02","value":"4500.0"},{"key":"2019-01-03","value":"4500.0"},{"key":"2019-01-04","value":"4500.0"},{"key":"2019-01-05","value":"4500.0"},{"key":"2019-01-06","value":"4500.0"},{"key":"2019-01-07","value":"4500.0"},{"key":"2019-01-08","value":"4500.0"},{"key":"2019-01-09","value":"4500.0"},{"key":"2019-01-10","value":"4500.0"},{"key":"2019-01-11","value":"4500.0"},{"key":"2019-01-12","value":"4500.0"},{"key":"2019-01-13","value":"4500.0"},{"key":"2019-01-14","value":"4500.0"},{"key":"2019-01-15","value":"4500.0"},{"key":"2019-01-16","value":"4500.0"},{"key":"2019-01-17","value":"4500.0"},{"key":"2019-01-18","value":"4500.0"},{"key":"2019-01-19","value":"4500.0"},{"key":"2019-01-20","value":"4500.0"},{"key":"2019-01-21","value":"4500.0"},{"key":"2019-01-22","value":"4500.0"},{"key":"2019-01-23","value":"4500.0"},{"key":"2019-01-24","value":"4500.0"},{"key":"2019-01-25","value":"4500.0"},{"key":"2019-01-26","value":"4500.0"},{"key":"2019-01-27","value":"4500.0"},{"key":"2019-01-28","value":"4500.0"},{"key":"2019-01-29","value":"4500.0"},{"key":"2019-01-30","value":"4500.0"},{"key":"2019-01-31","value":"4500.0"}]')
                   ,('congc','[{"key":"2019-01-01","value":"347.82608695652175"},{"key":"2019-01-02","value":"347.82608695652175"},{"key":"2019-01-03","value":"347.82608695652175"},{"key":"2019-01-04","value":"347.82608695652175"},{"key":"2019-01-07","value":"347.82608695652175"},{"key":"2019-01-08","value":"347.82608695652175"},{"key":"2019-01-09","value":"347.82608695652175"},{"key":"2019-01-10","value":"347.82608695652175"},{"key":"2019-01-11","value":"347.82608695652175"},{"key":"2019-01-14","value":"347.82608695652175"},{"key":"2019-01-15","value":"347.82608695652175"},{"key":"2019-01-16","value":"347.82608695652175"},{"key":"2019-01-17","value":"347.82608695652175"},{"key":"2019-01-18","value":"347.82608695652175"},{"key":"2019-01-21","value":"347.82608695652175"},{"key":"2019-01-22","value":"347.82608695652175"},{"key":"2019-01-23","value":"347.82608695652175"},{"key":"2019-01-24","value":"347.82608695652175"},{"key":"2019-01-25","value":"347.82608695652175"},{"key":"2019-01-28","value":"347.82608695652175"},{"key":"2019-01-29","value":"347.82608695652175"},{"key":"2019-01-30","value":"347.82608695652175"},{"key":"2019-01-31","value":"347.82608695652175"}]')
            ) as t(name, jsonStr)
)   , cte_rn as (
                select 1 as rn
                union all
                -- 8000 must be >= the length of the longest jsonStr; the
                -- original bound of 1000 was shorter than the sample strings
                -- and would silently drop the trailing elements
                select rn+1 from cte_rn where rn < 8000
    )
    , cte_splitJson as (
                SELECT  a.name
                              ,replace(replace(a.jsonStr,'[',''),']','') as jsonStr
                              ,substring(replace(replace(a.jsonStr,'[',''),']','')
                                            , b1.rn
                                            , charindex('},', replace(replace(a.jsonStr,'[',''),']','')+'},', b1.rn)-b1.rn ) as value_json
                from cte_t1 a
                    cross join cte_rn b1
                where  substring('},'+replace(replace(a.jsonStr,'[',''),']',''), rn, 2) = '},'
    )
    select *
    from cte_splitJson a
        cross apply dbo.getDateAmt(a.value_json) as t1
    -- note: generating row numbers recursively requires overriding the default recursion limit
    option(maxrecursion 0)
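For comparison, the whole pipeline can be sketched in a few lines of Python (`parse_json_string` is an illustrative name): strip the brackets, split the object list on '},', and extract a date/amount pair from each fragment. Note also that on SQL Server 2016 and later, the built-in OPENJSON function can parse such strings natively, which may be worth benchmarking against the approach above.

```python
import re

def parse_json_string(json_str: str) -> list[tuple[str, str]]:
    """End-to-end sketch of the CTE pipeline: split then extract."""
    body = json_str.replace("[", "").replace("]", "")  # cte_splitJson's replaces
    rows = []
    for fragment in body.split("},"):                  # the '},' position scan
        # same strip-and-split as dbo.getDateAmt
        cleaned = re.sub(r"[^0-9,\-.]", "", fragment)
        if not cleaned:
            continue
        date_part, amount_part = cleaned.split(",", 1)
        rows.append((date_part, amount_part))
    return rows

print(parse_json_string(
    '[{"key":"2019-01-01","value":"4500.0"},{"key":"2019-01-02","value":"3000.5"}]'))
# → [('2019-01-01', '4500.0'), ('2019-01-02', '3000.5')]
```

Each tuple corresponds to one row that `cross apply dbo.getDateAmt(...)` produces in the script above.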

5. Open problem

In tests on an ordinary PC, performance is a real concern: roughly 15 minutes for 500,000 rows, which is hard to accept. If you have experience with this problem, ideas for improving the efficiency, or a different approach altogether, please chime in.

That concludes this article on efficiently parsing JSON-formatted data in SQL Server; see the related articles for more on parsing JSON in SQL Server.

原文链接:https://blog.csdn.net/zjsearching/article/details/107068419
