As a lightweight local storage solution, LowDB is an ideal choice for storing and managing data in small projects that don't rely on a server. You will often see it in small Node.js, Electron, and browser projects.
https://github.com/typicode/lowdb
1. Usage
npm install lowdb
or:
yarn add lowdb
const low = require('lowdb');
const FileSync = require('lowdb/adapters/FileSync'); // several adapters are available
const adapter = new FileSync('db.json'); // declare an adapter
const db = low(adapter);
db.defaults({posts: [], user: {}, count: 0})
.write();
db.get('posts')
.push({id: 1, title: 'lowdb is awesome'})
.write()
db.set('user.name', 'typicode')
.write()
db.update('count', n => n + 1)
.write()
Running the program creates a db.json file in the project, containing the data that was added:
{
"posts": [
{
"id": 1,
"title": "lowdb is awesome"
}
],
"user": {
"name": "typicode"
},
"count": 1
}
lowdb is built on lodash, so you can use any of lodash's powerful functions, such as _.get() and _.find(), and chain them together:
db.get('users')
.find({sex: 'male'})
.value()
2. API
The main API functions:
low(adapter): returns a lodash chain with additional properties and functions
db.[...].write() / .value(): write / read data
db.getState() / .setState(): get / set the database state
db._: the database's lodash instance; use it to add your own functions or third-party mixins such as lodash-id:
db._.mixin({
second: function(array) {
return array[1]
}
})
db.get('posts')
.second()
.value()
3. Adapters API
The adapters that ship with lowdb (FileSync, FileAsync, and LocalBrowser) accept the following options:
defaultValue: the value used when the file does not exist;
serialize / deserialize: functions applied before writing and after reading.
const adapter = new FileSync('db.json', {
serialize: (data) => encrypt(JSON.stringify(data)),
deserialize: (data) => JSON.parse(decrypt(data))
})
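The encrypt and decrypt calls above are placeholders for whatever transformation you need. As a runnable sketch of the same idea (with base64 encoding standing in for real encryption, purely for illustration):
const FileSync = require('lowdb/adapters/FileSync')
// Base64 stands in for real encryption here; swap in your own functions.
const adapter = new FileSync('db.json', {
  serialize: (data) => Buffer.from(JSON.stringify(data)).toString('base64'),
  deserialize: (data) => JSON.parse(Buffer.from(data, 'base64').toString('utf8'))
})
const db = low(adapter)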
4. Queries
You can query directly with lodash functions. Be aware that some operations modify the original data; to avoid that, use .cloneDeep(). Operations are lazy and only execute once you call .value() or .write().
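A minimal sketch of both points, assuming a users collection already exists in db.json:
// Lazy: building the chain does nothing until .value() or .write() is called
const query = db.get('users').filter({sex: 'male'})
// Without cloneDeep(), the result still references objects inside the database,
// so mutating it would also change the in-memory state.
const unsafe = query.value()
// With cloneDeep(), you get detached copies that are safe to modify.
const safe = query.cloneDeep().value()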
Check whether users exists
db.has('users')
.value()
Set users
db.set('users', [])
.write()
Filter, sort, and take
db.get('users')
.filter({sex: 'male'})
.sortBy('age')
.take(5)
.value()
Get a specific field
db.get('users')
.map('name')
.value()
Get the number of entries
db.get('users')
.size()
.value()
Get a nested value
db.get('users[0].name')
.value()
Update an entry
db.get('users')
.find({name: 'Tom'})
.assign({name: 'Tim'})
.write()
Remove entries
db.get('users')
.remove({name: 'Tim'})
.write()
Remove a property
db.unset('user.name')
.write()
Deep clone
db.get('users')
.cloneDeep()
.value()
5. Using ids
You can use shortid or lodash-id to give every record in the database a unique id, and then look up and manipulate records by id:
const shortid = require('shortid')
const postId = shortid.generate()
db.get('posts')
  .push({ id: postId, title: 'low!' })
  .write()
const post = db
.get('posts')
.find({ id: postId })
.value()
Alternatively, lodash-id generates ids automatically when you insert records:
const low = require('lowdb')
const lodashId = require('lodash-id')
const FileSync = require('lowdb/adapters/FileSync')
const adapter = new FileSync('db.json')
const db = low(adapter)
db._.mixin(lodashId)
// We need to set some default values, if the collection does not exist yet
// We can also store our collection
const collection = db
.defaults({ posts: [] })
.get('posts')
// Insert a new post...
const newPost = collection
.insert({ title: 'low!' })
.write()
// ...and retrieve it using its id
const post = collection
.getById(newPost.id)
.value()
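lodash-id also provides id-based helpers such as updateById and removeById (see the lodash-id README for the full list); continuing with the collection above:
// Update the inserted post by id, then remove it
collection
  .updateById(newPost.id, { title: 'low?' })
  .write()
collection
  .removeById(newPost.id)
  .write()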
6. Custom adapters
The low() function also accepts a custom adapter:
class MyStorage {
constructor() {
// ...
}
read() {
// Should return data (object or array) or a Promise
}
write(data) {
// Should return nothing or a Promise
}
}
const adapter = new MyStorage(args)
const db = low(adapter);
==============================================
The official English README, which is more concise:
Install
npm install lowdb
Usage
Lowdb 3 is a pure ESM package. If you're having trouble importing it in your project, please read this.
import { join, dirname } from 'path'
import { Low, JSONFile } from 'lowdb'
import { fileURLToPath } from 'url'
const __dirname = dirname(fileURLToPath(import.meta.url));
// Use JSON file for storage
const file = join(__dirname, 'db.json')
const adapter = new JSONFile(file)
const db = new Low(adapter)
// Read data from JSON file, this will set db.data content
await db.read()
// If db.json doesn't exist, db.data will be null
// Set default data
// db.data = db.data || { posts: [] } // Node < v15.x
db.data ||= { posts: [] } // Node >= 15.x
// Create and query items using plain JS
db.data.posts.push('hello world')
const firstPost = db.data.posts[0]
// Alternatively, you can also use this syntax if you prefer
const { posts } = db.data
posts.push('hello world')
// Finally write db.data content to file
await db.write()
// db.json
{
"posts": [ "hello world" ]
}
TypeScript
You can use TypeScript to type check your data.
type Data = {
words: string[]
}
const adapter = new JSONFile<Data>('db.json')
const db = new Low(adapter)
db.data
.words
.push('foo') // ✅
db.data
.words
.push(1) // ❌
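As in the JavaScript example above, db.data is null until the file has been read and a default has been set, so in practice the typed example needs the same setup (a sketch reusing the Data type from above):
const adapter = new JSONFile<Data>('db.json')
const db = new Low(adapter)
await db.read()
db.data ||= { words: [] } // default value, typed as Data
db.data.words.push('foo')
await db.write()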
Lodash
You can also add lodash or other utility libraries to improve lowdb.
import lodash from 'lodash'
type Post = {
id: number;
title: string;
}
type Data = {
posts: Post[]
}
// Extend Low class with a new `chain` field
class LowWithLodash<T> extends Low<T> {
chain: lodash.ExpChain<this['data']> = lodash.chain(this).get('data')
}
const adapter = new JSONFile<Data>('db.json')
const db = new LowWithLodash(adapter)
await db.read()
// Instead of db.data use db.chain to access lodash API
const post = db.chain
.get('posts')
.find({ id: 1 })
.value() // Important: value() must be called to execute chain
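Note that the chain only reads and mutates db.data in memory; to persist changes made through it, you still call db.write() afterwards. A short sketch under that assumption:
// Add a post through the chain, then persist the whole state
db.chain.get('posts').push({ id: 2, title: 'second post' }).value()
await db.write()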
More examples
For CLI, server and browser usage, see examples/ directory.
API
Classes
Lowdb has two classes (for asynchronous and synchronous adapters).
new Low(adapter)
import { Low, JSONFile } from 'lowdb'
const db = new Low(new JSONFile('file.json'))
await db.read()
await db.write()
new LowSync(adapterSync)
import { LowSync, JSONFileSync } from 'lowdb'
const db = new LowSync(new JSONFileSync('file.json'))
db.read()
db.write()
Methods
db.read()
Calls adapter.read() and sets db.data.
Note: JSONFile and JSONFileSync adapters will set db.data to null if the file doesn't exist.
db.data // === null
db.read()
db.data // !== null
db.write()
Calls adapter.write(db.data).
db.data = { posts: [] }
db.write() // file.json will be { posts: [] }
db.data = {}
db.write() // file.json will be {}
Properties
db.data
Holds your db content. If you're using the adapters coming with lowdb, it can be any type supported by JSON.stringify.
For example:
db.data = 'string'
db.data = [1, 2, 3]
db.data = { key: 'value' }
Adapters
Lowdb adapters
JSONFile / JSONFileSync
Adapters for reading and writing JSON files.
new Low(new JSONFile(filename))
new LowSync(new JSONFileSync(filename))
Memory / MemorySync
In-memory adapters. Useful for speeding up unit tests.
new Low(new Memory())
new LowSync(new MemorySync())
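A minimal sketch of switching between the file adapter and the in-memory one, using an environment check (the NODE_ENV convention is an assumption, not part of lowdb):
import { Low, Memory, JSONFile } from 'lowdb'
// Keep tests fast and free of file-system side effects
const adapter = process.env.NODE_ENV === 'test'
  ? new Memory()
  : new JSONFile('db.json')
const db = new Low(adapter)
await db.read()
db.data ||= { posts: [] }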
LocalStorage
Synchronous adapter for window.localStorage.
new LowSync(new LocalStorage(name))
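A browser-side sketch, persisting the data under a 'db' key (the key name is arbitrary):
import { LowSync, LocalStorage } from 'lowdb'
// Everything is stored as one JSON string in window.localStorage under 'db'
const db = new LowSync(new LocalStorage('db'))
db.read()
db.data ||= { posts: [] }
db.data.posts.push('hello world')
db.write()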
TextFile / TextFileSync
Adapters for reading and writing text. Useful for creating custom adapters.
Third-party adapters
If you've published an adapter for lowdb, feel free to create a PR to add it here.
Writing your own adapter
You may want to create an adapter to write db.data to YAML, XML, encrypt data, a remote storage, ...
An adapter is a simple class that just needs to expose two methods:
class AsyncAdapter {
read() { /* ... */ } // should return Promise<data>
write(data) { /* ... */ } // should return Promise<void>
}
class SyncAdapter {
read() { /* ... */ } // should return data
write(data) { /* ... */ } // should return nothing
}
For example, let's say you have some async storage and want to create an adapter for it:
import { api } from './AsyncStorage'
class CustomAsyncAdapter {
// Optional: your adapter can take arguments
constructor(args) {
// ...
}
async read() {
const data = await api.read()
return data
}
async write(data) {
await api.write(data)
}
}
const adapter = new CustomAsyncAdapter()
const db = new Low(adapter)
See src/adapters/ for more examples.
Custom serialization
To create an adapter for another format than JSON, you can use TextFile or TextFileSync.
For example:
import { Adapter, Low, TextFile } from 'lowdb'
import YAML from 'yaml'
class YAMLFile {
constructor(filename) {
this.adapter = new TextFile(filename)
}
async read() {
const data = await this.adapter.read()
if (data === null) {
return null
} else {
return YAML.parse(data)
}
}
write(obj) {
return this.adapter.write(YAML.stringify(obj))
}
}
const adapter = new YAMLFile('file.yaml')
const db = new Low(adapter)
Limits
Lowdb doesn't support Node's cluster module.
If you have large JavaScript objects (~10-100MB) you may hit some performance issues. This is because whenever you call db.write, the whole db.data is serialized using JSON.stringify and written to storage.
Depending on your use case, this can be fine or not. It can be mitigated by doing batch operations and calling db.write only when you need it.
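A sketch of that batching idea: make all changes in memory first, then serialize and write once.
db.data ||= { posts: [] }
// These pushes only touch db.data in memory; no file I/O happens yet
for (const title of ['batch 1', 'batch 2', 'batch 3']) {
  db.data.posts.push({ title })
}
// One JSON.stringify and one write for the whole batch
await db.write()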
If you plan to scale, it's highly recommended to use databases like PostgreSQL or MongoDB instead.