
Bulk insert #124

Closed
Custardcs opened this issue May 4, 2018 · 1 comment
@Custardcs

Hi, I have a question I need to find a solution to.

We are using this on both the client side and the server side for now, as it's quick for most things. We have recently run into an issue with bulk inserting.

I'll give an example.

We have a table (screenshot omitted; its schema matches the CREATE TABLE statement in the code below).

A few key things need to happen: the server needs to know whether it has to insert a row or update it. We get around that by having the client send up all the UPDATEs at once and all the INSERTs at once, and this is where our problem comes in.

The SyncID prevents duplication, so the server needs to check whether a row already exists before allowing the insert.

```js
try {
    // connect to the production DB
    let ProdDb;
    try {
        ProdDb = new Sqlite(path.join(db_path, 'DB/' + data_decrypt.DBname + '.db3'));
    } catch (e) {
        console.log(e);
    }

    // create the table if it doesn't exist
    ProdDb.prepare("CREATE TABLE IF NOT EXISTS Log (ID INTEGER PRIMARY KEY NOT NULL, TimeStamp varchar(254), Error varchar(254), SyncID varchar(254), DateChanged varchar(254), PageChanged varchar(254), SearchID varchar(254), Item varchar(254), Username varchar(254), Description varchar(254), EpsNum varchar(254))").run();

    const Prodbegin = ProdDb.prepare('BEGIN');
    const Prodcommit = ProdDb.prepare('COMMIT');
    Prodbegin.run();

    const Replacestmnt = ProdDb.prepare("REPLACE INTO Log (ID, TimeStamp, Error, SyncID, DateChanged, PageChanged, SearchID, Item, Username, Description, EpsNum) VALUES ($ID, $TimeStamp, $Error, $SyncID, $DateChanged, $PageChanged, $SearchID, $Item, $Username, $Description, $EpsNum)");
    const Insertstmnt = ProdDb.prepare("INSERT INTO Log (TimeStamp, Error, SyncID, DateChanged, PageChanged, SearchID, Item, Username, Description, EpsNum) VALUES ($TimeStamp, $Error, $SyncID, $DateChanged, $PageChanged, $SearchID, $Item, $Username, $Description, $EpsNum)");
    // prepare the lookup once with a bound parameter (the original re-prepared a
    // string-concatenated SELECT on every iteration, which is slow and injection-prone)
    const SyncIDSearchstmnt = ProdDb.prepare("SELECT SyncID FROM Log WHERE SyncID = ?");

    console.log(data_decrypt.DBname + " LOG START");
    for (let i = 0; i < data_decrypt.DataBaseRows.length; i++) {
        const Row = data_decrypt.DataBaseRows[i];

        // build a zero-padded "YYYY-MM-DD HH:MM:SS" timestamp
        const d = new Date();
        const pad = n => ('' + n).padStart(2, '0');
        const TodaysDate = d.getFullYear() + '-' + pad(d.getMonth() + 1) + '-' + pad(d.getDate())
            + ' ' + pad(d.getHours()) + ':' + pad(d.getMinutes()) + ':' + pad(d.getSeconds());

        if (Row.TimeStamp == "U") {
            Replacestmnt.run({
                ID: Row.ID,
                TimeStamp: TodaysDate,
                Error: Row.Error,
                SyncID: Row.SyncID,
                DateChanged: Row.DateChanged,
                PageChanged: Row.PageChanged,
                SearchID: Row.SearchID,
                Item: "1",
                Username: Row.Username,
                Description: Row.Description,
                EpsNum: Row.EpsNum
            });
        } else {
            // .get() returns undefined when no row matches
            // (the original compared the result to 0, so the insert branch never ran)
            const syncIDSearch = SyncIDSearchstmnt.get(Row.SyncID);
            if (syncIDSearch === undefined) {
                Insertstmnt.run({
                    TimeStamp: TodaysDate,
                    Error: Row.Error,
                    SyncID: Row.SyncID,
                    DateChanged: Row.DateChanged,
                    PageChanged: Row.PageChanged,
                    SearchID: Row.SearchID,
                    Item: "1",
                    Username: Row.Username,
                    Description: Row.Description,
                    EpsNum: Row.EpsNum
                });
            }
        }
    }
    Prodcommit.run();
    console.log(data_decrypt.DBname + " LOG END");

    try {
        Prodbegin.run();
        const Rows = ProdDb.prepare("SELECT * FROM Log").all();

        const SendToClient = {
            Method: "PP_LOG",
            DBname: data_decrypt.DBname,
            DataBaseRows: Rows
        };
        // encrypt the payload before sending it back to the client
        const encrypt = m_encrypt(SendToClient);
        const str = JSON.stringify({
            encryption: "ENCRYPTED",
            Data: encrypt
        });

        const buf = Buffer.from(str);
        socket.send(buf);

        Prodcommit.run();
    } catch (e) {
        console.log(e);
        Prodcommit.run();
    }
} catch (e) {
    console.log(e);
}
```

So obviously it's a lot of data to handle. We have 47K rows, and the server just isn't dealing with them quickly enough.

@JoshuaWise
Member

Although I've helped answer questions like this in the past, as this package is getting more popular I'm going to relinquish my responsibility of helping people learn how to write proper SQLite and Node.js.

I haven't taken a deep look at your code, but it appears to be fine at face value. The de facto answer to almost all performance questions in SQLite is to activate WAL mode:

```js
db.pragma('journal_mode = WAL');
```

You can read about WAL mode in the SQLite documentation.

Any other issues regarding real-world usage will be closed as a duplicate of this new thread: #125
