Performance: Processing large JSON in JS/PHP and storing it in MySQL, or processing it in MySQL [closed]

What is the most efficient way to store a large JSON in MySQL database using PHP?

As I understand it my options are:

  1. Process it in JS – make an AJAX call to POST the data (through a PHP script) to the database for every key/value pair I want to store (I can’t pass the JSON directly to PHP – it throws an object-size error and I don’t want to mess with php.ini).

  2. Pass it as text to PHP, decode it, and then run mysqli_query in a loop for every key/value pair I want to store.

  3. Pass it as text to MySQL and handle it there with MySQL’s built-in JSON functions.

Which option would be more performant?

Answer

Generally, you need to decide which fields of the JSON you will be searching on via SQL. Put those fields into their own columns. Then the JSON (with or without copies of those fields) is simply an opaque TEXT or JSON column. It will be written and read as a unit; the client will unpack it as needed.
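As a sketch of that layout – assuming a hypothetical `events` table with one extracted, searchable `user_id` column plus the raw document (the table and column names here are illustrative, not from the question):

```php
<?php
// Decode the incoming JSON (e.g. read from php://input) into an array.
$json = '{"user_id": 42, "tags": ["a", "b"], "payload": {"x": 1}}';
$data = json_decode($json, true);

// The field you search on gets its own indexed column; the full
// document is stored as-is in an opaque TEXT/JSON column.
$sql = 'INSERT INTO events (user_id, doc) VALUES (?, ?)';

// With a live connection this would be executed as:
//   $stmt = $mysqli->prepare($sql);
//   $stmt->bind_param('is', $data['user_id'], $json);
//   $stmt->execute();
echo $sql, "\n";
```

The client later reads `doc` back as one unit and runs `json_decode()` on it; only `user_id` is ever touched by SQL.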

PHP has decent functions for handling JSON. I suggest sticking with UTF-8 encoding of the JSON and using JSON_UNESCAPED_UNICODE. If stored into TEXT, be sure the column’s character set is utf8mb4; the collation does not matter. (The MySQL/MariaDB details vary with the exact version you are using.)
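For example, JSON_UNESCAPED_UNICODE makes json_encode emit literal UTF-8 bytes instead of \uXXXX escapes, which is more compact and stores cleanly in a utf8mb4 column (a small sketch using PHP’s built-in JSON functions):

```php
<?php
$data = ['city' => 'São Paulo', 'note' => 'こんにちは'];

// Default encoding escapes non-ASCII as \uXXXX sequences.
$escaped = json_encode($data);

// JSON_UNESCAPED_UNICODE keeps the characters as literal UTF-8.
$utf8 = json_encode($data, JSON_UNESCAPED_UNICODE);

// The round trip is lossless either way.
$back = json_decode($utf8, true);
```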

You ask for “performant”. By breaking out the columns needed for searching, MySQL indexing will work much faster than dynamically picking apart JSON. (Again the version matters when considering indexing JSON fields and/or using GENERATED columns.)
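On versions that support it (MySQL 5.7+ / MariaDB 10.2+, as noted the details vary), a GENERATED column can expose a JSON field for normal indexing without the application duplicating it – a sketch with hypothetical names, continuing the `events`/`doc` example:

```php
<?php
// Extract $.user_id from the JSON column into a stored generated
// column, then index it so WHERE user_id = ? uses an ordinary index
// instead of picking the JSON apart per row.
$ddl = <<<'SQL'
ALTER TABLE events
  ADD COLUMN user_id INT
    AS (JSON_EXTRACT(doc, '$.user_id')) STORED,
  ADD INDEX idx_user_id (user_id);
SQL;
echo $ddl, "\n";
```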

“Key-value” (EAV) schema is clumsy and bad for performance. So I hope you don’t need to create arbitrary tests across multiple key-value pairs. I discuss that further here: http://mysql.rjweb.org/doc.php/eav
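To illustrate the clumsiness: testing two key/value pairs in an EAV table costs one self-join per attribute, whereas broken-out columns need a single indexed WHERE. A sketch against a hypothetical `attrs(entity_id, k, v)` table:

```php
<?php
// EAV: every additional condition adds another self-join on attrs.
$eav = <<<'SQL'
SELECT a1.entity_id
FROM attrs AS a1
JOIN attrs AS a2 USING (entity_id)
WHERE a1.k = 'color' AND a1.v = 'red'
  AND a2.k = 'size'  AND a2.v = 'L';
SQL;
echo $eav, "\n";
```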