
A Summary of Common Methods for Removing Duplicate Data in an Oracle Database

2023-05-27


Create test data

create table nayi224_180824(col_1 varchar2(10), col_2 varchar2(10), col_3 varchar2(10));

insert into nayi224_180824
select 1, 2, 3 from dual union all
select 1, 2, 3 from dual union all
select 5, 2, 3 from dual union all
select 10, 20, 30 from dual;

commit;

select * from nayi224_180824;
COL_1  COL_2  COL_3
1      2      3
1      2      3
5      2      3
10     20     30

Query a de-duplicated result set for the specified columns

distinct

select distinct t1.* from nayi224_180824 t1; 
COL_1  COL_2  COL_3
10     20     30
1      2      3
5      2      3

This approach is quite limited, because it can only de-duplicate across all of the queried columns. If I want to de-duplicate on col_2 and col_3, the result set can only contain the col_2 and col_3 columns and cannot include col_1.

select distinct t1.col_2, col_3 from nayi224_180824 t1 
COL_2  COL_3
2      3
20     30

Still, it is the simplest and easiest to understand of these approaches.

row_number()

select *
  from (select t1.*,
               row_number() over(partition by t1.col_2, t1.col_3 order by 1) rn
          from nayi224_180824 t1) t1
 where t1.rn = 1;
COL_1  COL_2  COL_3  RN
1      2      3      1
10     20     30     1

It is noticeably more verbose, but it offers much greater flexibility.
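As a minimal sketch of that flexibility, suppose the requirement is to keep, within each (col_2, col_3) group, the row whose col_1 sorts last; only the order by inside the over() clause needs to change. Note that col_1 is a varchar2 in this test table, so the ordering here is lexical rather than numeric:

-- keep one row per (col_2, col_3), choosing the row with the "largest" col_1
select *
  from (select t1.*,
               -- the row we want to keep gets rn = 1;
               -- col_1 is varchar2, so this is a string ordering
               row_number() over(partition by t1.col_2, t1.col_3
                                 order by t1.col_1 desc) rn
          from nayi224_180824 t1) t1
 where t1.rn = 1;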

Query all duplicated rows for the specified columns

count having

select *
  from nayi224_180824 t
 where (t.col_2, t.col_3) in
       (select t1.col_2, t1.col_3
          from nayi224_180824 t1
         group by t1.col_2, t1.col_3
        having count(1) > 1);
COL_1  COL_2  COL_3
1      2      3
1      2      3
5      2      3

This scans the table twice, so it is relatively inefficient. Not recommended.

count over

select *
  from (select t1.*,
               count(1) over(partition by t1.col_2, t1.col_3) rn
          from nayi224_180824 t1) t1
 where t1.rn > 1;
COL_1  COL_2  COL_3  RN
1      2      3      3
1      2      3      3
5      2      3      3

This only scans the table once. Recommended.

Delete all duplicated rows

delete from nayi224_180824 t
 where t.rowid in
       (select rid
          from (select t1.rowid rid,
                       count(1) over(partition by t1.col_2, t1.col_3) rn
                  from nayi224_180824 t1) t1
         where t1.rn > 1);

This is just the statement above with a small change: it collects the rowids of the duplicated rows and deletes them.
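To double-check, the duplicate query from the "count over" section can simply be re-run; after this delete it should return no rows (with the test data above, only the 10, 20, 30 row survives, since all three rows sharing (2, 3) are removed):

-- should return no rows once all duplicated rows are gone
select *
  from (select t1.*,
               count(1) over(partition by t1.col_2, t1.col_3) rn
          from nayi224_180824 t1) t1
 where t1.rn > 1;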

Delete duplicates and keep one row

Analytic function approach

delete from nayi224_180824 t
 where t.rowid in
       (select rid
          from (select t1.rowid rid,
                       row_number() over(partition by t1.col_2, t1.col_3 order by 1) rn
                  from nayi224_180824 t1) t1
         where t1.rn > 1);

It has the high flexibility typical of analytic functions: you can partition however you like, and by changing the order by clause you can meet requirements such as "keep the row with the largest id".
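As a sketch of that, assuming the requirement is to keep the row with the largest col_1 in each (col_2, col_3) group (this test table has no id column, so col_1 stands in for one, and being varchar2 it sorts lexically), only the order by clause changes, mirroring the earlier select sketch:

delete from nayi224_180824 t
 where t.rowid in
       (select rid
          from (select t1.rowid rid,
                       -- the largest col_1 in each group gets rn = 1 and is kept
                       row_number() over(partition by t1.col_2, t1.col_3
                                         order by t1.col_1 desc) rn
                  from nayi224_180824 t1) t1
         where t1.rn > 1);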

group by

delete from nayi224_180824 t
 where t.rowid not in
       (select max(rowid)
          from nayi224_180824 t1
         group by t1.col_2, t1.col_3);

This trades away some flexibility in exchange for better efficiency.
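A quick sanity check after either keep-one delete (a hedged sketch; with the test data, exactly one row should remain per (col_2, col_3) combination, though which of the duplicated rows max(rowid) keeps is not tied to insertion order):

-- expect a count of 1 for every (col_2, col_3) combination
select t.col_2, t.col_3, count(1) cnt
  from nayi224_180824 t
 group by t.col_2, t.col_3;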

Summary

That concludes this article on common methods for removing duplicate data in an Oracle database. For more on removing duplicates in Oracle, please search our earlier articles or browse the related articles below. We hope you will continue to support us!
