How do I bulk delete multiple rows in HBase?

2022-09-01 21:12:56

I have rows with the following keys in the HBase table "mytable":

user_1
user_2
user_3
...
user_9999999

I want to delete the following rows from it using the HBase shell:

user_500 through user_900

I know there is no direct way to delete them, but is there a way to do this using the "bulk delete coprocessor"?

I saw it here:

https://github.com/apache/hbase/blob/master/hbase-examples/src/test/java/org/apache/hadoop/hbase/coprocessor/example/TestBulkDeleteProtocol.java

I would just like to paste in the imports and then paste this code into the shell, but I don't know how to go about it. Does anyone know how I can use this endpoint from the JRuby HBase shell?

Table ht = TEST_UTIL.getConnection().getTable(TableName.valueOf("my_table"));
long noOfDeletedRows = 0L;
Batch.Call<BulkDeleteService, BulkDeleteResponse> callable =
  new Batch.Call<BulkDeleteService, BulkDeleteResponse>() {
    ServerRpcController controller = new ServerRpcController();
    BlockingRpcCallback<BulkDeleteResponse> rpcCallback =
      new BlockingRpcCallback<BulkDeleteResponse>();

    public BulkDeleteResponse call(BulkDeleteService service) throws IOException {
      // scan, deleteType, rowBatchSize and timeStamp are defined in the
      // surrounding test code of TestBulkDeleteProtocol
      Builder builder = BulkDeleteRequest.newBuilder();
      builder.setScan(ProtobufUtil.toScan(scan));
      builder.setDeleteType(deleteType);
      builder.setRowBatchSize(rowBatchSize);
      if (timeStamp != null) {
        builder.setTimestamp(timeStamp);
      }
      service.delete(controller, builder.build(), rpcCallback);
      return rpcCallback.get();
    }
  };
Map<byte[], BulkDeleteResponse> result = ht.coprocessorService(BulkDeleteService.class,
    scan.getStartRow(), scan.getStopRow(), callable);
for (BulkDeleteResponse response : result.values()) {
  noOfDeletedRows += response.getRowsDeleted();
}
ht.close();

If there is no way to do this through JRuby, then Java or some alternative way to quickly delete multiple rows would also be fine.


Answer 1

Do you really want to do it in the shell? There are various other, better ways. One of them is to use the native Java API:

  • Construct a list of Delete objects
  • Pass this list to the Table.delete method

Approach 1: If you already know the range of row keys.

public void massDelete(byte[] tableName) throws IOException {
    // hbasePool is assumed to be an HTablePool from the old client API
    HTable table = (HTable) hbasePool.getTable(tableName);

    String tablePrefix = "user_";
    int startRange = 500;
    int endRange = 999;

    List<Delete> listOfBatchDelete = new ArrayList<Delete>();

    // build one Delete per key in the range and submit them in a single call
    for (int i = startRange; i <= endRange; i++) {
        String key = tablePrefix + i;
        Delete d = new Delete(Bytes.toBytes(key));
        listOfBatchDelete.add(d);
    }

    try {
        table.delete(listOfBatchDelete);
    } finally {
        if (hbasePool != null && table != null) {
            hbasePool.putTable(table);
        }
    }
}

Approach 2: If you want to bulk delete based on the results of a scan.

public void bulkDelete(final HTable table) throws IOException {
    Scan s = new Scan();
    List<Delete> listOfBatchDelete = new ArrayList<Delete>();
    // add your filters to the scanner with s.setFilter(...)
    ResultScanner scanner = table.getScanner(s);
    for (Result rr : scanner) {
        Delete d = new Delete(rr.getRow());
        listOfBatchDelete.add(d);
    }
    scanner.close();
    try {
        table.delete(listOfBatchDelete);
    } catch (Exception e) {
        LOGGER.log(e); // assumes some logger is in scope
    }
}
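
The filter line above is only a placeholder (Scan takes a filter through setFilter()). Since this scan is used solely to collect row keys for the Delete objects, a FirstKeyOnlyFilter keeps the returned data small; a minimal sketch of the scan setup, not part of the original answer:

Scan s = new Scan();
// FirstKeyOnlyFilter returns only the first cell of each row, which is
// enough to recover the row key for building the Delete
s.setFilter(new FirstKeyOnlyFilter());

Note that row keys sort lexicographically, so bounding such a scan with start/stop rows user_500 and user_900 would also match keys like user_5000 or user_89999; for the exact numeric range in the question, generating the keys as in Approach 1 is the safer route.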

Now, about using a coprocessor. Just one piece of advice: do not use coprocessors unless you are an expert in HBase. Coprocessors come with a number of built-in issues, and I can give you a detailed description of them if needed. Secondly, when you delete anything from HBase it is never removed directly; a tombstone marker is attached to the record and it is only physically removed later, during a major compaction. So there is no need to use a very resource-intensive coprocessor for this.
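
For completeness (this is not part of the original answer), a major compaction can also be requested on demand through the Admin API so that tombstoned cells are physically removed sooner; a minimal sketch, assuming an open Connection named connection:

Admin admin = connection.getAdmin();
try {
    // asynchronous request; HBase schedules the major compaction in the background
    admin.majorCompact(TableName.valueOf("mytable"));
} finally {
    admin.close();
}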

Modified code to support batched operation:

int batchSize = 50;
int batchCounter = 0;
for (int i = startRange; i <= endRange; i++) {

    String key = tablePrefix + i;
    Delete d = new Delete(Bytes.toBytes(key));
    listOfBatchDelete.add(d);
    batchCounter++;

    // send a batch to the server every batchSize deletes
    if (batchCounter == batchSize) {
        table.delete(listOfBatchDelete);
        listOfBatchDelete.clear();
        batchCounter = 0;
    }
}
// flush whatever is left over after the last full batch
if (!listOfBatchDelete.isEmpty()) {
    table.delete(listOfBatchDelete);
}

Create the HBase configuration and get a table instance:

Configuration hConf = HBaseConfiguration.create(conf);
hConf.set("hbase.zookeeper.quorum", "Zookeeper IP");
hConf.set("hbase.zookeeper.property.clientPort", ZookeeperPort);

HTable hTable = new HTable(hConf, tableName);
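
As a side note that is not part of the original answer: the HTable constructor above belongs to the older client API; in current versions a Table is obtained from a Connection (as in the next answer), and the manual batching shown earlier can be handed off to a BufferedMutator, which buffers mutations client-side and flushes them in batches. A minimal sketch, assuming an open Connection named connection and the key range from the question:

try (BufferedMutator mutator =
         connection.getBufferedMutator(TableName.valueOf("mytable"))) {
    for (int i = 500; i <= 900; i++) {
        // Delete is a Mutation, so it can be handed to the mutator like a Put
        mutator.mutate(new Delete(Bytes.toBytes("user_" + i)));
    }
    // close() (via try-with-resources) flushes anything still buffered
}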

Answer 2

If you already know the row keys of the records you want to delete from the HBase table, you can use the following approach.

1. First, create a List with Delete objects for these row keys:

for (int rowKey = 1; rowKey <= 10; rowKey++) {
    deleteList.add(new Delete(Bytes.toBytes(rowKey + "")));
}

2. Then obtain a Table object from the HBase Connection:

Table table = connection.getTable(TableName.valueOf(tableName));

3. Once you have the Table object, call delete(), passing in the list:

table.delete(deleteList);

The complete code will look like this:

Configuration config = HBaseConfiguration.create();
config.addResource(new Path("/etc/hbase/conf/hbase-site.xml"));
config.addResource(new Path("/etc/hadoop/conf/core-site.xml"));

String tableName = "users";

Connection connection = ConnectionFactory.createConnection(config);
Table table = connection.getTable(TableName.valueOf(tableName));

List<Delete> deleteList = new ArrayList<Delete>();

for (int rowKey = 500; rowKey <= 900; rowKey++) {
    deleteList.add(new Delete(Bytes.toBytes("user_" + rowKey)));
}

table.delete(deleteList);

// release resources when done
table.close();
connection.close();
