Bulk Update of data in CockroachDB
Working with customers always surfaces a wide range of topics, but in recent weeks the same scenario has come up repeatedly: "I have a very large table and I want to update a large portion of its records. I wrote an UPDATE SQL statement, but it never finishes."

So how does this situation come about? Three main scenarios have repeatedly led to the need to bulk update data within CockroachDB:

1. Data from a legacy datastore was imported into CockroachDB without any sanitization. Ideally, some form of ETL workflow would be used so that only clean data is stored in CockroachDB.

2. An application inserted data over a period of time that also wasn't properly sanitized.

3. A business use case arose that required updating data in place.

Once the malformed data is in CockroachDB, something needs to be done to fix it. Let's take an example table like the one below.

table_name ...
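The usual reason a single large UPDATE never finishes is that it touches every matching row in one transaction, holding locks and building up a huge write set. A common remedy is to update in small batches and repeat until no rows remain. Below is a minimal sketch of that pattern; the table name `events`, the column `status`, and the values are purely illustrative, not from the example table above.

```sql
-- Hypothetical batched update: run this statement in a loop
-- (from application code or a shell script) until it reports
-- 0 rows affected. Each execution is its own small transaction,
-- so locks are held briefly and retries are cheap.
UPDATE events
SET status = 'clean'
WHERE status = 'malformed'
LIMIT 1000;
```

The batch size (1000 here) is a tuning knob: large enough to amortize per-statement overhead, small enough to keep each transaction fast and avoid contention with foreground traffic. The WHERE clause must exclude already-updated rows, otherwise the loop never terminates.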