Keystore

origin

What is a keystore file

A keystore file is an encrypted version of your unique private key, which you use to sign your transactions. If you lose this file, you lose your assets.

What do keystore files look like

{
  "crypto": {
    "cipher": "aes-128-ctr",
    "cipherparams": {
      "iv": "83dbcc02d8ccb40e466191a123791e0e"
    },
    "ciphertext": "d172bf743a674da9cdad04534d56926ef8358534d458fffccd4e6ad2fbde479c",
    "kdf": "scrypt",
    "kdfparams": {
      "dklen": 32,
      "n": 262144,
      "r": 1,
      "p": 8,
      "salt": "ab0c7876052600dd703518d6fc3fe8984592145b591fc8fb5c6d43190334ba19"
    },
    "mac": "2103ac29920d71da29f15d75b4a16dbe95cfd7ff8faea1056c33131d846e3097"
  },
  "id": "3198bc9c-6672-5ab3-d995-4942343ae5b6",
  "version": 3
}

These fields mean:

  • cipher: The name of a symmetric AES algorithm

  • cipherparams: The parameters required for the “cipher” algorithm above

  • ciphertext: The private key encrypted using the “cipher” algorithm above

  • kdf: A key derivation function used to let you encrypt your keystore file with a password

  • kdfparams: The parameters required for the “kdf” algorithm above

  • mac: A code used to verify your password

Workflow

  1. Encrypting your private key

A symmetric algorithm uses a key to encrypt data; the resulting ciphertext can be decrypted with the same algorithm and the same key.

The relation between cipher, cipherparams, ciphertext:

  • cipher is the symmetric algorithm used to encrypt the private key.

  • cipherparams are the parameters required for the symmetric algorithm.

  • ciphertext is the encrypted output of the symmetric algorithm.

You get the decryption key as the output of the kdf.

So to decrypt your private key, you first need to recover this decryption key (the same key that was used for encryption).

  2. Protect with your passphrase

To make unlocking your account easy, you don’t need to remember the long, non-user-friendly decryption key used to decrypt ciphertext. Instead, the developers opted for passphrase-based protection.

The keystore uses a kdf (key derivation function) that computes the decryption key from a passphrase and a list of parameters.

  • kdf is the key derivation function used to compute the decryption-key from your passphrase.

  • kdfparams are the parameters required for the function

Given the passphrase and kdfparams, the kdf returns your decryption key.

  3. Make sure your passphrase is right

We need to guarantee that the passphrase typed to unlock the account is correct, i.e. the same one entered when the keystore was generated.

This is where the mac field comes in. Just after the kdf is executed, its result (the decryption key) and ciphertext are processed together and compared to mac. If the result matches mac, the passphrase was right and the decryption key is correct.

Conclusion

Note on Bip32

wiki

The specification consists of two parts:

  1. a system for deriving a tree of keypairs from a single seed.

  2. a demonstration of how to build a wallet structure on top of such a tree.

Specification: Key derivation

Convention

In this text we assume the public key cryptography used in Bitcoin, namely elliptic curve cryptography using the field and curve parameters defined by secp256k1. Variables below are either:

  • Integers modulo the order of the curve (referred to as n)

  • Coordinates of points on the curve

  • Byte sequences

Addition (+) of two coordinate pairs is defined as application of the EC group operation. Concatenation (||) is the operation of appending one byte sequence onto another.

Extended Keys

We are going to define a function to derive a number of child keys from a parent key. In order to prevent these from depending solely on the key itself, we extend both private and public keys first with an extra 256 bits of entropy. This extension, called the chain code, is identical for corresponding private and public keys, and consists of 256 bits, namely 32 bytes.

We represent an extended private key as (k, c), with k the normal private key and c the chain code. An extended public key is represented as (K, c), with K = point(k) and c the chain code.

Each extended key has 2^31 normal child keys, and 2^31 hardened child keys. Each of these child keys has an index. The normal child keys use indices 0 through 2^31 - 1.

The hardened child keys use indices 2^31 through 2^32 - 1.

To ease notation for hardened key indices, a number i_H represents i + 2^31.

Child key derivation (CKD) function

Given a parent extended key and an index i, it is possible to compute the corresponding child extended key. The algorithm to do so depends on whether the child is a hardened key or not.

Private parent key => private child key

The function CKDpriv((k_par, c_par), i) => (k_i, c_i) computes a child extended private key from the parent extended private key.

  • Check whether i >= 2^31 (whether the child is a hardened key)
    • if so (hardened child), let I = HMAC-SHA512(Key = c_par, Data = 0x00 || ser_256(k_par) || ser_32(i))
    • if not (normal child), let I = HMAC-SHA512(Key = c_par, Data = ser_P(point(k_par)) || ser_32(i))
  • Split I into two 32-byte sequences, I_L and I_R.
  • The returned child key k_i is parse_256(I_L) + k_par (mod n)
  • The returned chain code c_i is I_R

Private parent key => public child key

Public parent key => private child key

This is not possible.

The key tree

The next step is cascading several CKD constructions to build a tree. We start with one root, the master extended key m. By evaluating CKDpriv(m, i) for several values of i, we get a number of level-1 derived nodes. As each of these is again an extended key, CKDpriv can be applied to those as well.

To shorten notation, we will write CKDpriv(CKDpriv(CKDpriv(m, 3_H), 2), 5) as m/3_H/2/5. Equivalently for public keys, we write CKDpub(CKDpub(CKDpub(M, 3), 2), 5) as M/3/2/5. This results in the following identities:

  • N(m/a/b/c) = N(m/a/b)/c = N(m/a)/b/c = N(m)/a/b/c = M/a/b/c
  • N(m/a_H/b/c) = N(m/a_H/b)/c = N(m/a_H)/b/c

However, N(m/a_H) cannot be rewritten as N(m)/a_H, as the latter is not possible.

Key identifiers

Extended keys can be identified by the HASH160 (RIPEMD-160 after SHA-256) of the serialized ECDSA public key K, ignoring the chain code. This corresponds exactly to the data used in traditional Bitcoin addresses. It is not advised to represent this data in Base58 format though, as it may be interpreted as an address that way.

The first 32 bits of the identifier are called the key fingerprint.

Serialization format

Extended public and private keys are serialized as follows:

  • 4 bytes: version bytes (mainnet: 0x0488B21E public, 0x0488ADE4 private; testnet: 0x043587CF public, 0x04358394 private)
  • 1 byte: depth: 0x00 for master nodes, 0x01 for level-1 derived keys, and so on.
  • 4 bytes: the fingerprint of the parent’s key (0x00000000 if master key)
  • 4 bytes: child number, 0x00000000 if master key
  • 32 bytes: the chain code
  • 33 bytes: the public key or private key data

This 78-byte structure can be encoded like other Bitcoin data in Base58, by first adding 32 checksum bits (derived from the double SHA-256 checksum), and then converting to the Base58 representation.

Master key generation

The total number of possible extended keypairs is almost 2^512, but the produced keys are only 256 bits long, and offer about half of that in terms of security. Therefore the master keys are not generated directly, but instead from a potentially short seed value.

  • Generate a seed byte sequence S of a chosen length (between 128 and 512 bits; 256 bits is advised) from a (P)RNG.
  • Calculate I = HMAC-SHA512(Key = “Bitcoin seed”, Data = S)
  • Split I into two 32-byte sequences, I_L and I_R
  • Use parse_256(I_L) as master secret key and I_R as master chain code.

In case I_L is 0 or >= n, the master key is invalid.

Specification: Wallet Structure

The default wallet layout

An HD Wallet is organized as several ‘accounts’. Accounts are numbered, the default account (“”) being numbered 0. Clients are not required to support more than one account; if they don’t, they use only the default account.

Each account is composed of two keypair chains: an internal and an external one. The external keychain is used to generate new public addresses, while the internal one is used for all other operations (change addresses, generated addresses, anything that doesn’t need to be communicated).

React Hooks

Basic Hooks

useState

const [state, setState] = useState(initialState)

Returns a stateful value and a function to update it.

During subsequent re-renders, the first value returned by useState will always be the most recent state after applying updates.

Functional updates

If the new state is computed using the previous state, you can pass a function to setState. The function will receive the previous value and return an updated value.

Note: Unlike the setState method found in class components, useState does not automatically merge update objects.
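Both behaviors can be modeled in plain JavaScript. This is only an illustration of the contract, not React's actual implementation: the setter accepts either a value or an updater function, and it replaces the state, so merging must be done explicitly with object spread.

```javascript
// Toy model of useState's update contract (not React's real implementation)
function createState(initialState) {
  let state = initialState
  const setState = (update) => {
    // Functional update: receive the previous state, return the next one
    state = typeof update === 'function' ? update(state) : update
  }
  return [() => state, setState]
}

const [getCount, setCount] = createState(0)
setCount(prev => prev + 1)
setCount(prev => prev + 1)
// getCount() === 2

// No automatic merge: spread the previous object yourself
const [getUser, setUser] = createState({ name: 'Ann', age: 30 })
setUser(prev => ({ ...prev, age: 31 })) // name is preserved
```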

useReducer is better suited for managing state objects that contain multiple sub-values.

Lazy initialization

The initialState argument is the state used during the initial render. In subsequent renders, it is disregarded. If the initial state is the result of an expensive computation, you may provide a function instead, which will be executed only on the initial render.

useEffect

useEffect(didUpdate)

Accepts a function that contains imperative, possibly effectful code.

Mutations, subscriptions, timers, logging, and other side effects are not allowed inside the main body of a function component. Instead, use useEffect. The function passed to useEffect will run after the render is committed to the screen.

By default, effects run after every completed render, but you can choose to fire them only when certain values have changed.

Cleaning up an effect

useEffect(() => {
  const subscription = props.source.subscribe()
  return () => {
    // Clean up the subscription
    subscription.unsubscribe()
  }
})

Timing of effects

Unlike componentDidMount and componentDidUpdate, the function passed to useEffect fires after layout and paint, during a deferred event. This makes it suitable for many common side effects, like setting up subscriptions and event handlers, because most types of work shouldn’t block the browser from updating the screen.

useMutationEffect and useLayoutEffect have the same signature as useEffect and only differ in when they are fired.

Although useEffect is deferred until after the browser has painted, it’s guaranteed to fire before any new renders.

Conditionally firing an effect

useEffect(
  () => {
    const subscription = props.source.subscribe()
    return () => subscription.unsubscribe()
  },
  [props.source],
)

useContext

const context = useContext(Context)

Accepts a context object (the value returned from React.createContext) and returns the current context value, as given by the nearest context provider for the given context.

When the provider updates, this hook will trigger a rerender with the latest context value.

Additional Hooks

useReducer

const [state, dispatch] = useReducer(reducer, initialState)

An alternative to useState. Accepts a reducer of type (state, action) => newState, and returns the current state paired with a dispatch method.

const initialState = { count: 0 }

function reducer(state, action) {
  switch (action.type) {
    case 'reset': {
      return initialState
    }
    case 'increment': {
      return { count: state.count + 1 }
    }
    case 'decrement': {
      return { count: state.count - 1 }
    }
    default:
      return state
  }
}

function Counter({ initialCount }) {
  const [state, dispatch] = useReducer(reducer, { count: initialCount })
  return (
    <>
      Count: {state.count}
      <button onClick={() => dispatch({ type: 'reset' })}>Reset</button>
    </>
  )
}

Lazy initialization

useReducer accepts an optional third argument, initialAction. If provided, the initial action is applied during the initial render. This is useful for computing an initial state that includes values passed via props.

useCallback

const memoizedCallback = useCallback(
  () => {
    doSomething(a, b)
  },
  [a, b],
)

Returns a memoized callback.

Pass an inline callback and an array of inputs; useCallback will return a memoized version of the callback that only changes if one of the inputs has changed. This is useful when passing callbacks to optimized child components that rely on reference equality to prevent unnecessary renders.

Note: The array of inputs is not passed as arguments to the callback. Conceptually, though, that’s what they represent: every value referenced inside the callback should also appear in the inputs array.

useMemo

const memoizedValue = useMemo(() => computeExpensiveValue(a, b), [a, b])

Returns a memoized value.

Pass a “create” function and an array of inputs; useMemo will only recompute the memoized value when one of the inputs has changed. This optimization helps to avoid expensive calculations on every render.

If no array is provided, a new value will be computed whenever a new function instance is passed as the first argument (with an inline function, on every render).
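The input-array comparison can be modeled in plain JavaScript; this is an illustration of the behavior, not React's real implementation:

```javascript
// Toy model of useMemo's input comparison (not React's real implementation)
function createMemo() {
  let lastInputs = null
  let lastValue
  return (create, inputs) => {
    const inputsChanged =
      lastInputs === null ||
      inputs.length !== lastInputs.length ||
      inputs.some((input, i) => !Object.is(input, lastInputs[i]))
    if (inputsChanged) {
      lastInputs = inputs
      lastValue = create() // recompute only when an input changed
    }
    return lastValue
  }
}

const memo = createMemo()
let computations = 0
const expensive = (a, b) => { computations += 1; return a + b }

const first = memo(() => expensive(1, 2), [1, 2])  // computes
const second = memo(() => expensive(1, 2), [1, 2]) // same inputs: cached
// first === second === 3, computations === 1
```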

useRef

const refContainer = useRef(initialValue)

useRef returns a mutable ref object whose .current property is initialized to the passed argument (initialValue). The returned object will persist for the full lifetime of the component.

A common use case is to access a child imperatively:

function TextInputFocusButton() {
  const inputEl = useRef(null)
  const onButtonClick = () => {
    inputEl.current.focus()
  }
  return (
    <>
      <input ref={inputEl} type="text" />
      <button onClick={onButtonClick}>Focus the input</button>
    </>
  )
}

useImperativeMethods

useImperativeMethods(ref, createInstance, [inputs])

useImperativeMethods customizes the instance value that is exposed to parent components when using ref. As always, imperative code using refs should be avoided in most cases. useImperativeMethods should be used with forwardRef.

function FancyInput(props, ref) {
  const inputRef = useRef()
  useImperativeMethods(ref, () => ({
    focus: () => {
      inputRef.current.focus()
    },
  }))
  return <input ref={inputRef} />
}
FancyInput = forwardRef(FancyInput)

In this example, a parent component that renders <FancyInput ref={fancyInputRef} /> would be able to call fancyInputRef.current.focus().

useMutationEffect

The signature is identical to useEffect, but it fires synchronously during the same phase that React performs its DOM mutations, before sibling components have been updated.

Prefer the standard useEffect when possible to avoid blocking visual updates.

Note: Avoid reading from the DOM in useMutationEffect. If you do, you can cause performance problems by introducing layout thrashing.

useLayoutEffect

The signature is identical to useEffect but it fires synchronously after all DOM mutations.

Use this to read layout from the DOM and synchronously re-render. Updates scheduled inside useLayoutEffect will be flushed synchronously, before the browser has a chance to paint.

Fires in the same phase as componentDidMount and componentDidUpdate.

5 Tips in CRA@2.0

Origin

It’s been a long time since I last worked with React.

Five Tips

  1. Displaying Lint Error in the Editor

Create .eslintrc with the following content:

{
  "extends": "react-app"
}
  2. Formatting Code Automatically
npm install --save-dev prettier husky lint-staged

Then add the following configuration to package.json:

{
  "husky": {
    "hooks": {
      "pre-commit": "lint-staged"
    }
  },
  "lint-staged": {
    "src/**/*.{js,jsx,json,css}": ["prettier --write", "git add"]
  }
}

I prefer to use Prettier together with ESLint.

  3. Developing Components in Isolation

I use bit

  4. Making a Progressive Web App

In src/index.js, change serviceWorker.unregister() to serviceWorker.register()

  5. Code Splitting

Create React App v2 supports code splitting via dynamic import() statements. That is, if it encounters a call to import('./someModule') when building your app, it will create a new chunk for someModule and all its dependencies, entirely separate from your entry bundle.

import React, { Component } from 'react'
import { Formik } from 'formik'
import * as Yup from 'yup'

const formValidator = Yup.object().shape(/* ... */)

export default class Form extends Component {
  render() {
    return <Formik validationSchema={formValidator}>{/* ... */}</Formik>
  }
}
import React, { Component } from 'react'

export default class App extends Component {
  state = {
    Form: undefined,
  }

  showForm = async () => {
    const { default: Form } = await import('./Form')
    this.setState({ Form })
  }

  render() {
    const { Form } = this.state
    return <div className="app">{Form ? <Form /> : <button onClick={this.showForm}>Show Form</button>}</div>
  }
}

Assembly in Solidity

Inline Assembly



Analysis of Ujo

What is Ujo

The Ujo platform uses blockchain technology to create a transparent and decentralized database of rights and rights owners, automating royalty payments using smart contracts and cryptocurrency.

Ujo Music is a ConsenSys spoke with a vision for a music industry that allows creators to grow and thrive independently. It is a platform that uses the Ethereum blockchain as the substrate for innovation: it empowers artists by digitizing their music rights and metadata and sharing this information in an open environment, enabling new applications, products, and services to license their catalogs and pay artists directly with minimal friction.

History

2015.10, IMOGEN HEAP: In 2015 Imogen Heap collaborated with Ujo to demonstrate how Ethereum could usher in a modern music supply chain built on a backbone of prompt and transparent payments.

Why use Ujo

How does Ujo work

Reference

Development of Neuron Web Extension

Create Project

create-react-app neuron-web --scripts-version=react-scripts-ts

Add Manifest

// ./public/manifest.json

{
  "short_name": "NeuronWeb",
  "name": "Neuron Web",
  "start_url": "./index.html",
  "display": "standalone",
  "theme_color": "#000",
  "background_color": "#fff",
  "browser_action": {
    "default_popup": "./index.html",
    "default_title": "NervosWeb"
  },
  "manifest_version": 2,
  "version": "1.0",
  "permissions": [
    "tabs",
    "activeTab",
    "clipboardWrite",
    "http://**/*",
    "https://**/*"
  ],
  "content_security_policy": "script-src 'self' 'unsafe-eval'; object-src 'self'"
}

Add Style Dependencies

<link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Roboto:300,400,500">
yarn add @material-ui/{core,icons}

Details in Web3 -- Contract

Basic Usage

// To initialize a contract

var Contract = require('web3-eth-contract')
Contract.setProvider('ws://localhost:8546')
var contract = new Contract(abi, address, options)

Import

var _ = require('underscore')
var core = require('web3-core')
var Method = require('web3-core-method')
var utils = require('web3-utils')
var Subscription = require('web3-core-subscriptions').subscription
var formatters = require('web3-core-helpers').formatters
var errors = require('web3-core-helpers').errors
var promiEvent = require('web3-core-promievent')
var abi = require('web3-eth-abi')

Contract Constructor

/*
* Should be called to create new contract instance
*
* @method Contract
* @constructor
* @param {Array} jsonInterface
* @param {String} address
* @param {Object} options
*/

var Contract = function Contract(jsonInterface, address, options) {
var _this = this
var args = Array.prototype.slice.call(arguments)

// check if the instance is initialized correctly
if (!(this instanceof Contract)) {
throw new Error(
'Please use the "new" keyword to instantiate a web3.eth.contract() object!',
)
}

// sets _requestManager
core.packageInit(this, [this.constructor.currentProvider])

// set clearSubscriptions
this.clearSubscriptions = this._requestManager.clearSubscriptions
// check params jsonInterface
if (!jsonInterface || !Array.isArray(jsonInterface)) {
throw new Error(
'You must provide the json interface of the contract when instantiating a contract object',
)
}

// clear the options object
this.options = {}

var lastArg = args[args.length - 1]
// set options
if (_.isObject(lastArg) && !_.isArray(lastArg)) {
options = lastArg
this.options = _.extend(this.options, this._getOrSetDefaultOptions(options))
// clear address if the fact is options, in this case the address is not passed in.
if (_.isObject(address)) {
address = null
}
}

// set address
Object.defineProperty(this.options, 'address', {
set: value => {
if (value) {
_this._address = utils.toChecksumAddress(
formatters.inputAddressFormatter(value),
)
}
},
get: () => {
return _this._address
},
enumerable: true,
})

// add method and event signature when the jsonInterface gets set
Object.defineProperty(this.options, 'jsonInterface', {
set: value => {
_this.methods = {}
_this.events = {}

_this._jsonInterface = value.map(method => {
var func, funcName
if (method.name) {
funcName = utils._jsonInterfaceMethodToString(method)
}

if (method.type === 'function') {
method.signature = abi.encodeFunctionSignature(funcName)
func = _this._createTxObject.bind({
method: method,
parent: _this,
})

// add method only if not one already exists
if (!_this.methods[method.name]) {
_this.methods[method.name] = func
} else {
var cascadeFunc = _this._createTxObject.bind({
method: method,
parent: _this,
nextMethod: _this.methods[method.name],
})
this.methods[method.name] = cascadeFunc
}
// definitely add the method based on its signature
_this.methods[method.signature] = func
//add method by name
_this.methods[method.name] = func
// event
} else if (method.type === 'event') {
method.signature = abi.encodeEventSignature(funcName)
var event = _this._on.bind(_this, method.signature)

// add method only if not already exists
if (
!_this.events[method.name] ||
_this.events[method.name].name === 'bound'
) {
_this.events[method.name] = event
}

// definitely add the method based on its signature
_this.events[method.signature] = event

// add event by name
_this.events[funcName] = event
}

return method
})

// add allEvents
_this.events.allEvents = _this._on.bind(_this, 'allevents')

return _this._jsonInterface
},
get: () => {
return _this._jsonInterface
},
enumerable: true,
})

// get default account from the Class
var defaultAccount = this.constructor.defaultAccount
var defaultBlock = this.constructor.defaultBlock || 'latest'

Object.defineProperty(this, 'defaultAccount', {
get: () => {
return defaultAccount
},
set: value => {
if (value) {
defaultAccount = utils.toChecksumAddress(
formatters.inputAddressFormatter(value),
)
}
return value
},
enumerable: true,
})

Object.defineProperty(this, 'defaultBlock', {
get: () => {
return defaultBlock
},
set: value => {
defaultBlock = value
return value
},
enumerable: true,
})

// properties
this.methods = {}
this.events = {}
this._address = null
this._jsonInterface = []

// set getter/setter properties
this.options.address = address
this.options.jsonInterface = jsonInterface
}

Contract.setProvider = function(provider, accounts) {
// Contract.currentProvider = provider
core.packageInit(this, [provider])
this._ethAccounts = accounts
}

/**
* Get the callback and modify the array if necessary
*
* @method _getCallback
* @param {Array} args
* @return {Function} the callback
*/
Contract.prototype._getCallback = function getCallback(args) {
if (args && _.isFunction(args[args.length - 1])) {
return args.pop() // modify the args array!
}
}

/**
* Checks that no listener with name 'newListener' or 'removeListener' is added
*
* @method _checkListener
* @param {String} type
* @param {String} event
* @return {Object} the contract instance
*/
Contract.prototype._checkListener = function(type, event) {
if (event === type) {
throw new Error(
`The event ${type} is a reserved event name, you can't use it`,
)
}
}

/**
* Use default values, if options are not available
*
* @method _getOrSetDefaultOptions
* @param {Object} options the options given by the user
* @return {Object} the options with gaps filled by defaults
*/
Contract.prototype._getOrSetDefaultOptions = function getOrSetDefaultOptions(
options,
) {
var gasPrice = options.gasPrice ? String(options.gasPrice) : null
var from = options.from ? utils.toChecksumAddress(formatters.inputAddressFormatter(options.from)): null

options.data = options.data || this.options.data

options.from = from || this.options.from
options.gasPrice = gasPrice || this.options.gasPrice
options.gas = options.gas || options.gasLimit || this.options.gas

delete options.gasLimit
return options
}

// TODO:
// _encodeEventABI
// _decodeEventABI
// _encodeMethodABI
// _decodeMethodABI
// _decodeMethodReturn

/**
* Deploys a contract and fire events based on its state: transactionHash, receipt
*
* All event listeners will be removed, once the last possible event is fired ('error' or 'receipt')
*
* @method deploy
* @param {Object} options
* @param {Function} callback
* @return {Object} EventEmitter possible are "error", "transactionHash", and "receipt"
*/
Contract.prototype.deploy = function(options, callback) {
options = options || {}
options.arguments = options.arguments || []
options = this._getOrSetDefaultOptions(options)

// return error if no 'data' is specified
if (!options.data) {
return utils._fireError(new Error('No "data" specified in neither the given options, nor the default options'), null, null, callback)
}

var constructor = _.find(this.options.jsonInterface, function(method) {
return (method.type === 'constructor')
}) || {}

constructor.signature = 'constructor'

return this._createTxObject.apply({
method: constructor,
parent: this,
deployData: options.data,
_ethAccounts: this.constructor._ethAccounts
}, options.arguments)
}

// TODO:
// _generateEventOptions
// clone
// once
// _on
// getPastEvents

/**
* Returns an object with call, send, estimate functions
*
* @method _createTxObject
* @return {Object} an object with functions to call the method
*/

Contract.prototype._createTxObject = function _createTxObject(){
var args = Array.prototype.slice.call(arguments)
var txObject = {}

if (this.method.type === 'function') {
txObject.call = this.parent._executeMethod.bind(txObject, 'call')
txObject.call.request = this.parent._executeMethod.bind(txObject, 'call', true) // to make batch request
}

txObject.send = this.parent._executeMethod.bind(txObject, 'send')
txObject.send.request = this.parent._executeMethod.bind(txObject)
txObject.encodeABI = this.parent._encodeMethodABI.bind(txObject)
txObject.estimateGas = this.parent._executeMethod.bind(txObject, 'estimate')

// check arguments length
if (args && this.method.inputs && args.length !== this.method.inputs.length) {
if (this.nextMethod) {
return this.nextMethod.apply(null, args)
}
throw errors.InvalidNumberOfParams(args.length, this.method.inputs.length, this.method.name)
}

txObject.arguments = args || []
txObject._method = this.method
txObject._parent = this.parent
txObject._ethAccounts = this.parent.constructor._ethAccounts || this._ethAccounts

if(this.deployData) {
txObject._deployData = this.deployData
}

return txObject
}

// _processExecuteArguments

/**
* Execute a call, transact or estimateGas on a contract function
*
* @method _executeMethod
* @param {String} type the type this execute function should execute
* @param {Boolean} makeRequest if true, it simply returns the request parameters, rather than execute it.
*/

Contract.prototype._executeMethod = function _executeMethod () {
var _this = this
var args = this._parent._processExecuteArguments.call(this, Array.prototype.slice.call(arguments))
var defer = promiEvent(args.type !== 'send')
var ethAccounts = _this.constructor._ethAccounts || _this._ethAccounts

// simply return request for batch requests

if (args.generateRequest) {
// handle batch request
var payload = {
params: [formatters.inputCallFormatter.call(this._parent, args.options)],
callback: args.callback,
}

if (args.type === 'call') {
payload.params.push(formatters.inputDefaultBlockNumber.call(this._parent, args.defaultBlockNumber))
payload.method = 'eth_call'
payload.format = this._parent._decodeMethodReturn.bind(null, this._method.outputs)
} else {
payload.method = 'eth_sendTransaction'
}
return payload
} else {
// handle call or send
switch (args.type) {
case 'estimate': {
var estimateGas = (new Method({
name: 'estimateGas',
call: 'eth_estimateGas',
params: 1,
inputFormatter: [formatters.inputCallFormatter],
outputFormatter: utils.hexToNumber,
requestManager: _this._parent._requestManager,
accounts: ethAccounts, // specify accounts
defaultAccount: _this._parent.defaultAccount,
defaultBlock: _this._parent.defaultBlock,
})).createFunction();
return estimateGas(args.options, args.callback)
}
case 'call': {
var call = (new Method({
name: 'call',
call: 'eth_call',
params: 2,
inputFormatter: [formatters.inputCallFormatter, formatters.inputDefaultBlockNumberFormatter],
outputFormatter: result => {
return _this._parent._decodeMethodReturn(_this._method.outputs, result)
},
requestManager: _this._parent._requestManager,
accounts: ethAccounts, // is eth.accounts (necessary for wallet signing)
defaultAccount: _this._parent.defaultAccount,
defaultBlock: _this._parent.defaultBlock,
})).createFunction()

return call(args.options, args.defaultBlock, args.callback)
}
case 'send': {
if(!utils.isAddress(args.options.from)) {
return utils._fireError(new Error('No "from" address specified in neither the given options, nor the default options.'), defer.eventEmitter, defer.reject, args.callback);
}

if (_.isBoolean(this._method.payable) && !this._method.payable && args.options.value && args.options.value > 0) {
return utils._fireError(new Error('Can not send value to non-payable contract method or constructor'), defer.eventEmitter, defer.reject, args.callback);
}

// make sure receipt logs are decoded
var extraFormatters = {
receiptFormatter: receipt => {
if (_.isArray(receipt.logs)) {
// decode logs
var events = _.map(receipt.logs, log => {
return _this._parent._decodeEventABI.call({
name: 'ALLEVENTS',
jsonInterface: _this._parent.options.jsonInterface,
}, log)
})

// make sure log names keys
receipt.events = {}
var count = 0
events.forEach(ev => {
if (ev.event) {
// if > 1 of the same event, don't overwrite any existing events
if (receipt.events[ev.event]) {
if (Array.isArray(receipt.events[ev.event])) {
receipt.events[ev.event].push(ev)
} else {
receipt.events[ev.event] = [receipt.events[ev.event], ev]
}
} else {
receipt.events[ev.event] = ev
}
} else {
receipt.events[count] = ev
count++
}
})
delete receipt.logs
}
return receipt
},

contractDeployFormatter: receipt => {
var newContract = _this._parent.clone()
newContract.options.address = receipt.contractAddress
return newContract
}
}

var sendTransaction = (new Method({
name: 'sendTransaction',
call: 'eth_sendTransaction',
params: 1,
inputFormatter: [formatters.inputTransactionFormatter],
requestManager: _this._parent._requestManager,
accounts: _this.constructor._ethAccounts || _this._ethAccounts, // is eth.accounts
defaultAccount: _this._parent.defaultAccount,
defaultBlock: _this._parent.defaultBlock,
extraFormatter: extraFormatters,
})).createFunction()

return sendTransaction(args.options, args.callback)
}
}
}
}

Declaration in TypeScript

RxJS v6

animationFrame - scheduler

const animationFrame: any;

Perform task when window.requestAnimationFrame would fire.

When animationFrame scheduler is used with delay, it will fall back to async scheduler behaviour.

Without delay, the animationFrame scheduler can be used to create smooth browser animations. It makes sure the scheduled task happens just before the next browser content repaint, thus performing animations as efficiently as possible.
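A minimal browser-only sketch of such an animation, assuming RxJS v6 (where the scheduler is exported as animationFrameScheduler) and a hypothetical div element on the page:

```javascript
import { animationFrameScheduler } from 'rxjs'

// hypothetical element to animate
const div = document.querySelector('div')

// each iteration is scheduled just before the next repaint
animationFrameScheduler.schedule(function (height) {
  div.style.height = height + 'px'
  if (height < 300) {
    this.schedule(height + 1) // reschedule on the next animation frame
  }
}, 0, 0)
```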

Content Parser in CITA

The parser we now use is:

const parser = (content: string) => {
const bytes = hexToBytes(content)
const decoded = pb.UnverifiedTransaction.deserializeBinary(bytes)
const tx = decoded.getTransaction()
return {
from: tx.getFrom ? tx.getFrom() : '',
to: tx.getTo ? tx.getTo() : '',
data: tx.getData ? tx.getData() : '',
value: tx.getValue ? tx.getValue().toString() : ''
}
}

Here we use these methods:

  • hexToBytes(content) => bytes

  • pb.UnverifiedTransaction.deserializeBinary(bytes) => decoded

  • decoded.getTransaction() => transaction

  • transaction.{getFrom, getTo, getValue, getData}

hexToBytes
const hexToBytes = (hex: string) => {
let _hex = hex.startsWith('0x') ? hex.slice(2) : hex // remove 0x prefix
let result = []
while (_hex.length >= 2) {
result.push(parseInt(_hex.substring(0, 2), 16)) // parse hex to bytes(int8 array)
_hex = _hex.substring(2, _hex.length)
}
return result
}
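A quick sanity check of the conversion (restating the helper so the snippet runs standalone): each pair of hex digits becomes one byte.

```javascript
const hexToBytes = (hex) => {
  let _hex = hex.startsWith('0x') ? hex.slice(2) : hex // drop the 0x prefix
  const result = []
  while (_hex.length >= 2) {
    result.push(parseInt(_hex.substring(0, 2), 16)) // one byte per hex-digit pair
    _hex = _hex.substring(2)
  }
  return result
}

console.log(hexToBytes('0x0a1b2c')) // [ 10, 27, 44 ]
```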
pb.UnverifiedTransaction.deserializeBinary
const jspb = require('google-protobuf')
const goog = jspb

proto.UnverifiedTransaction = function (opt_data) {
// a regular function is needed here: arrow functions cannot be used with `new`
jspb.Message.initialize(this, opt_data, 0, -1, null, null)
}

proto.UnverifiedTransaction.deserializeBinaryFromReader = (msg, reader) => {
while (reader.nextField()) {
if (reader.isEndGroup()) {
break
}

let field = reader.getFieldNumber()
switch (field) {
case 1: {
let value = new proto.Transaction()
reader.readMessage(value, proto.Transaction.deserializeBinaryFromReader)
msg.setTransaction(value)
break
}
case 2: {
let value = /** @type {!Uint8Array} */ (reader.readBytes())
msg.setSignature(value)
break
}
case 3: {
let value = /** @type {!proto.Crypto} */ (reader.readEnum())
msg.setCrypto(value)
break
}
default: {
reader.skipField()
break
}
}
}
return msg
}

proto.UnverifiedTransaction.deserializeBinary = bytes => {
const reader = new jspb.BinaryReader(bytes)
const msg = new proto.UnverifiedTransaction()
return proto.UnverifiedTransaction.deserializeBinaryFromReader(msg, reader)
}

GenServer in Erlang

Original

GenServer is an essential part of OTP. It takes over repetitive tasks, letting the programmer concentrate on the logic of the application rather than on handling edge cases and repeated error handling.

Every Time When GenServer callback is called

The idea behind GenServer is simple: you start a separate process that holds some state; on each incoming message (be that a call or a cast) it may change its internal state and, in the case of a call, also generate a response.

In this guide the calling process is named Alice and the newly spawned process is Bob.

Programming without GenServer, as you would do it manually
defmodule SimpleGenServerMock do
def start_link() do
# runs in the *caller* context `Alice`
spawn_link(__MODULE__, :init, [])
end

def call(pid, arguments) do
# runs in the *caller* context `Alice`
send pid, {:call, self(), arguments}
receive do
{:response, data} -> data
end
end

def cast(pid, arguments) do
# runs in `caller` context `Alice`
send pid, {:cast, arguments}
end

def init() do
# runs in the *server* context `Bob`
initial_state = 1
loop(initial_state)
end

def loop(state) do
# runs in the *server* context `Bob`

receive do
{:call, pid, :get_data} ->
# do some work on data here and update state
{new_state, data} = {state, state}
send pid, {:response, data}
loop(new_state)
{:cast, :increment} ->
# do some work on data here and update state
new_state = state + 1
loop(new_state)
end
end
end

The code initial_state = 1 is exactly the same code we write in the init callback. The internal state of the server is simply an integer. Usually it is a map, tuple, or list holding settings and state.

{state, state} means that we do not want to update the state and want to return the state as the result. This is the code that goes into the handle_call callback in Bob.

And the code new_state = state + 1 goes into the handle_cast callback, because we do not need to respond with a result; we just change server Bob's internal state.

Working with the module looks like this:

pid = SimpleGenServerMock.start_link()
counter = SimpleGenServerMock.call(pid, :get_data)
IO.puts "Counter: #{counter}"
SimpleGenServerMock.cast(pid, :increment)
counter = SimpleGenServerMock.call(pid, :get_data)
IO.puts "Counter: #{counter}"
Same Server With GenServer Behaviour

Now, if we want to rewrite the same code using GenServer, it looks like this:

defmodule SimpleGenServerBehaviour do
use GenServer

def start_link() do
# runs in the *caller* context `Alice`
GenServer.start_link(__MODULE__, [])
end

def init(_) do
# runs in the *server* context `Bob`
{:ok, 1}
end

def handle_call(:get_data, _from, state) do
# runs in the *server* context `Bob`
{:reply, state, state}
end

def handle_cast(:increment, state) do
# runs in the *server* context `Bob`
{:noreply, state + 1}
end
end

While in this example it did not save many lines, for more complicated code having GenServer deal with all the complexity saves a lot of typing. You also get timeouts, named processes, and stable, production-proven error handling for free.

Using the GenServer behaviour is very similar to the code we wrote before:

{:ok, pid} = GenServer.start_link(SimpleGenServerBehaviour, [])
counter = GenServer.call(pid, :get_data)
GenServer.cast(pid, :increment)

It is better to implement the client API in the module:

defmodule Stack do
use GenServer

def start_link(default) do
GenServer.start_link(__MODULE__, default)
end

def push(pid, item) do
GenServer.cast(pid, {:push, item})
end

def pop(pid) do
GenServer.call(pid, :pop)
end

# Server callbacks

@impl true
def handle_call(:pop, _from, [h | t]) do
{:reply, h, t}
end

@impl true
def handle_cast({:push, item}, state) do
{:noreply, [item | state]}
end
end
Receiving Regular Messages

The goal of GenServer is to abstract the “receive” loop for developers, automatically handling system messages, supporting code changes, synchronous calls and more. Therefore, you should never call your own “receive” inside GenServer callbacks, as doing so will cause the GenServer to misbehave.

Besides the synchronous and asynchronous communication provided by call/3 and cast/2, regular messages sent by functions such as Kernel.send/2, Process.send_after/4 and similar can be handled inside the handle_info/2 callback.

handle_info/2 can be used in many situations, such as handling monitor DOWN messages sent by Process.monitor/1. Another use case for handle_info/2 is to perform periodic work, with the help of Process.send_after/4:

defmodule MyApp.Periodically do
use GenServer

def start_link do
GenServer.start_link(__MODULE__, %{})
end

@impl true
def init(state) do
schedule_work() # Schedule work to be performed on start
{:ok, state}
end

@impl true
def handle_info(:work, state) do
# Do the desired work here
schedule_work() # Reschedule once more
{:noreply, state}
end

defp schedule_work() do
Process.send_after(self(), :work, 1000)
end
end

Association in Phoenix

Original

Associations

Associations in Ecto are used when two different sources (tables) are linked via foreign keys.

A classic example of this setup is “Post has many comments”. First, create the two tables in a migration:

create table(:posts) do
add :title, :string
add :body, :text

timestamps()
end

create table(:comments) do
add :post_id, references(:posts)
add :body, :text

timestamps()
end

Each comment contains a post_id column that by default points to a post's id.

And now define the schemas:

defmodule MyApp.Post do
use Ecto.Schema

schema "posts" do
field :title
field :body
has_many :comments, MyApp.Comment
timestamps
end
end

defmodule MyApp.Comment do
use Ecto.Schema

schema "comments" do
field :body
belongs_to :post, MyApp.Post
timestamps
end
end

Querying associations

One of the benefits of defining associations is that they can be used in queries. For example:

Repo.all from p in Post, preload: [:comments]

Now all posts will be fetched from the database with their associated comments. The example above will perform two queries: one for loading all posts and another for loading all comments. This is often the most efficient way of loading associations from the database (even if two queries are performed), because we need to receive and parse only POSTS + COMMENTS results.

It is also possible to preload associations using joins while performing more complex queries. For example, imagine both posts and comments have votes and you want only comments with more votes than the post itself:

Repo.all from p in Post,
join: c in assoc(p, :comments),
where: c.votes > p.votes,
preload: [comments: c]

Manipulating associations

Ecto 2.0 allows you to insert a post with multiple comments in one operation:

Repo.insert!(%Post{
title: "Hello",
body: "world",
comments: [
%Comment{body: "Excellent"}
]
})

Many times you may want to break it into distinct steps so you have more flexibility in managing those entries. For example, you could use changesets to build your posts and comments along the way:

post = Ecto.Changeset.change(%Post{}, title: "Hello", body: "World")
comment = Ecto.Changeset.change(%Comment{}, body: "Excellent")
post_with_comments = Ecto.Changeset.put_assoc(post, :comments, [comment]) # Main Step
Repo.insert!(post_with_comments)

Or by handling each entry individually inside a transaction:

Repo.transaction fn ->
post = Repo.insert!(%Post{title: "Hello", body: "World"})
# Build a comment from the post struct
comment = Ecto.build_assoc(post, :comments, body: "Excellent")
Repo.insert!(comment)
end

Ecto.build_assoc/3 builds the comment using the id currently set in the post struct. It is equivalent to:

%Comment{post_id: post.id, body: "Excellent"}

The Ecto.build_assoc/3 function is especially useful in Phoenix controllers. For example, when creating the post, one would do:

Ecto.build_assoc(current_user, :posts)

As we likely want to associate the post with the user currently signed in to the application. In another controller, we could build a comment for an existing post with:

Ecto.build_assoc(post, :comments)

Ecto does not provide functions like post.comments << comment that allow mixing persisted data with non-persisted data.

Difference Between build_assoc, put_assoc, and cast_assoc

Cast Assoc code

cast_assoc(changeset, name, opts \\ [])

Casts the given association with the changeset params

This function should be used when working with the entire association at once (and not a single element of a many-style association) and when using data external to the application.

When updating the data, this function requires the association to have been preloaded in the changeset struct. Missing data will invoke the :on_replace behaviour defined on the association. Preloading is not necessary for newly built structs.

The parameters for the given association will be retrieved from changeset.params. Those parameters are expected to be a map with attributes, similar to the ones passed to cast/4. Once the parameters are retrieved, cast_assoc/3 will match them with the associations already in the changeset record.

For example, imagine a user has-many addresses relationship where the post data is sent as follows:

%{"name" => "John Doe", "addresses" => [
%{"street" => "Some where", "country" => "Brazil", "id" => 1},
%{"street" => "Else where", "country" => "Poland"},
]}

and then

user
|> Repo.preload(:addresses)
|> Ecto.Changeset.cast(params, [])
|> Ecto.Changeset.cast_assoc(:addresses)

Once cast_assoc/3 is called, Ecto will compare those params with the addresses already associated with the user and acts as follows:

  • If the parameter does not contain an ID, the parameter data will be passed to changeset/2 with a new struct and become an insert operation

  • If the parameter contains an ID and there is no associated child with such ID, the parameter data will be passed to changeset/2 with a new struct and become an insert operation

  • If the parameter contains an ID and there is an associated child with such ID, the parameter data will be passed to changeset/2 with the existing struct and become an update operation.

  • If there is an associated child with an ID and its ID is not given as parameter, the :on_replace callback for that association will be invoked

Code
`cast_assoc/3` is useful when the associated data is managed alongside the parent struct, all at once.

To work with a single element of an association, other functions are more appropriate. For example to insert a single associated struct for a `has_many` association it's much easier to construct the associated struct with `Ecto.build_assoc/3` and persist it directly with `Ecto.Repo.insert/2`

Furthermore, if each side of the association is managed separately, it is preferable to use `put_assoc/3` and directly instruct Ecto how the association should look.

For example, imagine you are receiving a set of tags you want to associate with a user. Those tags are meant to **exist upfront**. Using `cast_assoc/3` won't work as desired because the tags are not managed alongside the user. In such cases, `put_assoc/3` will work as desired. With the given parameters:

%{"name" => "John Doe", "tags" => ["linear"]}

and then

tags = Repo.all(from t in Tag, where: t.name in ^params["tags"])

user
|> Repo.preload(:tags)
|> Ecto.Changeset.cast(params, []) # no need to allow :tags as we put them directly
|> Ecto.Changeset.put_assoc(:tags, tags) # explicitly set the tags

Note: the changeset must have been previously cast using `cast/4` before this function is called.
def cast_assoc(changeset, name, opts \\ []) when is_atom(name) do
cast_relation(:assoc, changeset, name, opts)
end

defp cast_relation(type, %Changeset{} = changeset, key, opts) do
{ key, param_key } = cast_key(key)
%{data: data, types: types, params: params, changes: changes} = changeset
%{related: related} = relation = relation!(:cast, type, key, Map.get(types, key))
params = params || %{}

{changeset, required?} = if opts[:required] do
{update_in(changeset.required, &[key|&1]), true}
else
{changeset, false}
end

on_cast = Keyword.get_lazy(opts, :with, fn -> on_cast_default(type, related) end)
original = Map.get(data, key)

changeset = case Map.fetch(params, param_key) do
{:ok, value} ->
current = Relation.load!(data, original)
case Relation.cast(relation, value, current, on_cast) do
{:ok, change, relation_valid?} when change != original ->
missing_relation(%{changeset | changes: Map.put(changes, key, change), valid?: changeset.valid? and relation_valid?}, key, current, required?, relation, opts)
:error ->
%{changeset | errors: [{key, {message(opts, :invalid_message, "is invalid"), [type: expected_relation_type(relation)]}} | changeset.errors], valid?: false}
_ -> missing_relation(changeset, key, current, required?, relation, opts)
end
:error -> missing_relation(changeset, key, current, required?, relation, opts)
end

update_in changeset.types[key], fn {type, relation} ->
{type, %{relation | on_cast: on_cast}}
end
end

Put Assoc

An alternative to cast_assoc/3

cast_assoc/3 is useful when the associated data is managed alongside the parent struct, all at once.

If each side of the association is managed separately, it is preferable to use put_assoc/3 and directly instruct Ecto how the association should look.

For example, imagine you are receiving a set of tags you want to associate with a user. Those tags are meant to exist upfront. Using cast_assoc/3 won’t work as desired because the tags are not managed alongside the user. In such cases, put_assoc/3 will work as desired.

%{"name" => "John Doe", "tags" => ["linear"]}

and then:

tags = Repo.all(from t in Tag, where: t.name in ^params["tags"])

user
|> Repo.preload(:tags)
|> Ecto.Changeset.cast(params, [])
|> Ecto.Changeset.put_assoc(:tags, tags)

Example

Build_Assoc

Generate an instance with the foreign key set:

iex> post = Ecto.build_assoc(user, :posts, %{header: "Clickbait header", body: "No real content"})
%EctoAssoc.Post{__meta__: #Ecto.Schema.Metadata<:built, "posts">,
body: "No real content", header: "Clickbait header", id: nil,
user: #Ecto.Association.NotLoaded<association :user is not loaded>, user_id: 1}

iex> Repo.insert!(post)
Put_Assoc

Add an association to a changeset that is not yet persisted:

iex> post_changeset = Ecto.Changeset.change(post)
iex> post_with_tags = Ecto.Changeset.put_assoc(post_changeset, :tags, [misc_tag])
iex> Repo.insert!(post_with_tags)

Conclusion

cast_assoc when you want to cast external parameters, like the ones from a form, into an association.

put_assoc when you already have an association struct.

build_assoc receives an existing struct that was persisted to the database (for example a user) and builds a struct (for example a post) based on one of its associations (for example :posts), with the foreign-key field set (for example user_id).

Introduction to Object.getOwnPropertyDescriptors

Object.getOwnPropertyDescriptors

This method returns all own property descriptors, including getters and setters.

Object.assign shallow-copies all enumerable properties of the source object, but it invokes getters and copies their results as plain values, losing the getter and setter definitions.

const car = {
name: 'BMW',
price: 100000,
get discount () {
return this.d
},
set discount (x) {
this.d = x
}
}

Object.getOwnPropertyDescriptor(car, 'discount') // {get, set, enumerable, configurable}

const assignedCar = Object.assign({}, car)
Object.getOwnPropertyDescriptor(assignedCar, 'discount') // {value: undefined}

const definedCar = Object.defineProperties({}, Object.getOwnPropertyDescriptors(car))
Object.getOwnPropertyDescriptor(definedCar, 'discount') // {get, set, enumerable, configurable}
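The difference can be checked directly; this self-contained snippet mirrors the car example above:

```javascript
const car = {
  name: 'BMW',
  get discount () { return this.d },
  set discount (x) { this.d = x }
}

// Object.assign evaluates the getter once and copies a plain value
const assignedCar = Object.assign({}, car)
// defineProperties + getOwnPropertyDescriptors copies the accessor pair itself
const definedCar = Object.defineProperties({}, Object.getOwnPropertyDescriptors(car))

definedCar.discount = 20   // goes through the copied setter
assignedCar.discount = 20  // just overwrites a plain data property

console.log(typeof Object.getOwnPropertyDescriptor(definedCar, 'discount').get) // 'function'
console.log(Object.getOwnPropertyDescriptor(assignedCar, 'discount').get)       // undefined
```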

Enable Sass/scss in Phoenix

Step 1: Add sass-brunch

cd assets && yarn add sass-brunch

Step 2: Add plugin in brunch-config.js

// brunch-config.js
plugins: {
sass: {
mode: 'native',
},
}

Step 3: Enable hot-reload

# dev.exs
# add sass and scss to the pattern
config :citaDappStore, CitaDappStoreWeb.Endpoint,
live_reload: [
patterns: [
~r{priv/static/.*(js|css|png|jpeg|jpg|gif|svg|sass|scss)$},
~r{priv/gettext/.*(po)$},
~r{lib/citaDappStore_web/views/.*(ex)$},
~r{lib/citaDappStore_web/templates/.*(eex)$}
]
]

That’s all.

Model Generator of Rails

Origin

Basic Usage

rails g model User email

This command will generate a User model with an email field of type string, a migration that creates the users table, a test for the model, and a factory.

If you want a field with a type other than string, pass the type after the field name, separated by a colon (:).

The whole list of available types:

integer
primary_key
decimal
float
boolean
binary
string
text
date
time
datetime
timestamp

You can pass the --parent option to the generator. It will make the generated class inherit from the passed name, to achieve STI (Single Table Inheritance):

rails g model admin --parent user

This example generates the model:

class Admin < User
end

Interestingly, if you generate a model in some scope, passing a name like admin/user or Admin::User:

rails g model admin/user

you will get the generated model in app/models/admin/user.rb, plus a scope file app/models/admin.rb, which is required to define the module:

module Admin
def self.table_name_prefix
'admin_'
end
end

This means the generated table name for Admin::User gets the prefix admin_ (i.e. admin_users). This feature lets you keep namespaced models separated both in Rails code and in the DB schema.

Advanced usage

Sometimes you need to automatically add an index for columns in your migration. It’s not a problem:

rails g model user email:index location_id:integer:index

Or a unique index:

rails g model user pseudo:string:uniq

Set a limit for integer, string, text and binary fields, or a precision for decimal fields:

rails g model product 'price:decimal{10,2}'

The last useful feature of generators is the option to generate reference columns (fields which are used in Rails as foreign keys):

rails g model photo album:references

This command will generate a photos table with an integer field album_id, and it will also add an index for this field automatically:

class CreatePhotos < ActiveRecord::Migration
def change
create_table :photos do |t|
t.references :album

t.timestamps
end

add_index :photos, :album_id
end
end

Error Handling in Solidity

Solidity uses state-reverting exceptions to handle errors. Such an exception will undo all changes made to the state in the current call (and all its sub-calls) and also flag an error to the caller. The convenience functions assert and require can be used to check for conditions and throw an exception if the condition is not met.

The assert function should only be used to test for internal errors, and to check invariants.

The require function should be used to ensure valid conditions, such as inputs or contract state variables, are met, or to validate return values from calls to external contracts.

If used properly, analysis tools can evaluate your contract to identify the conditions and function calls which will reach a failing assert. Properly functioning code should never reach a failing assert statement. If this happens there is a bug in your contract which you should fix.

There are two ways to trigger exceptions:

  • The revert function can be used to flag an error and revert the current call. In the future, it might be possible to also include details about the error in a call to revert.

  • The throw keyword can also be used as an alternative to revert().

From version 0.4.13, throw is deprecated.

When exceptions happen in a sub-call, they ‘bubble up’ automatically. Exceptions to this rule are send and the low-level functions call, delegatecall and callcode – those return false in case of an exception instead of ‘bubble up’.

low-level function call, delegatecall, callcode return false when exceptions occur.

Catching exceptions is not yet possible.
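A minimal sketch tying the three mechanisms together (contract and function names are illustrative, targeting a 0.4.22+ compiler):

```solidity
pragma solidity ^0.4.22;

contract Bank {
    address public owner;
    mapping(address => uint) public balances;

    constructor() public {
        owner = msg.sender;
    }

    function deposit() public payable {
        require(msg.value > 0); // require: validate inputs and preconditions
        balances[msg.sender] += msg.value;
    }

    function withdraw(uint amount) public {
        require(balances[msg.sender] >= amount);
        balances[msg.sender] -= amount;
        msg.sender.transfer(amount);
        // assert: an invariant that should never fail in correct code
        assert(balances[msg.sender] <= address(this).balance);
    }

    function close() public {
        if (msg.sender != owner) {
            revert(); // flag an error and revert the current call
        }
        selfdestruct(owner);
    }
}
```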

Start in Drizzle

Drizzle is a collection of front-end libraries that make writing dapp front-ends easier and more predictable. The core of Drizzle is based on a Redux store, so you have access to the spectacular development tools around Redux. We take care of synchronizing your contract data, transaction data and more. Things stay fast because you declare what to keep in sync.

  • Fully reactive contract data, including state, events and transactions

  • Declarative, so you’re not wasting valuable cycles on unneeded data.

  • Maintains access to underlying functionality. Web3 and your contract’s methods are still there, untouched.

Installation

yarn add drizzle

If you use React, you can use drizzle-react and (optionally) its companion drizzle-react-components.

Drizzle uses web3 1.0 and WebSockets; be sure your development environment can support these.

  1. Import the provider
import { Drizzle, generateStore } from 'drizzle'
  2. Create an options object and pass in the desired contract artifacts for Drizzle to instantiate. Other options are available; see the Options section below.
// import contract
import SimpleStorage from './../build/contracts/SimpleStorage.json'
import TutorialToken from './../build/contracts/TutorialToken.json'

const options = {
contracts: [
SimpleStorage
]
}

const drizzleStore = generateStore(options)
const drizzle = new Drizzle(options, drizzleStore)

Contract Interaction

Drizzle provides helpful methods on top of the default web3 Contract methods to keep your calls and transactions in sync with the store.

cacheCall()

Gets contract data. Calling the cacheCall() function on a contract will execute the desired call and return a corresponding key so the data can be retrieved from the store.

When a new block is received, Drizzle will refresh the store automatically _if_ any transactions in the block touched our contract.

Note: We have to check that Drizzle is initialized before fetching data. A simple if statement such as the one below is fine for displaying a few pieces of data, but a better approach for larger dapps is to use a loading component. Drizzle provides one in drizzle-react-components as well.

// Assuming we're observing the store for changes
var state = drizzle.store.getState()

// If Drizzle is initialized (and therefore web3, accounts, and contracts ), continue
if (state.drizzleStatus.initialized) {
// Declare this call to be cached and synchronized. We'll receive the store key for recall
const dataKey = drizzle.contracts.SimpleStorage.methods.storedData.cacheCall()

// Use the dataKey to display data from the store
return state.contracts.SimpleStorage.methods.storedData[dataKey].value
}

// If Drizzle isn't initialized, display some loading indication
return 'loading'

The Contract instance has all of its standard web3 properties and methods. For example, you could still call as normal if you don’t want something in the store:

drizzle.contracts.SimpleStorage.methods.storedData().call() // different from methods.storedData.cacheCall()

cacheSend()

Sends a contract transaction. Calling the cacheSend() function on a contract will send the desired transaction and return a corresponding hash so the status can be retrieved from the store. The last argument can optionally be an options object with the typical from, gas and gasPrice keys. Drizzle will update the transaction’s state in the store (pending, success, error) and store the transaction receipt.

Note: We have to check that Drizzle is initialized before fetching data. A simple if statement such as the one below is fine for displaying a few pieces of data, but a better approach for larger dapps is to use a loading component.

// Assuming we're observing the store for changes
var state = drizzle.store.getState()

// if Drizzle is initialized ( and therefore web3, accounts, and contracts ), continue
if (state.drizzleStatus.initialized) {
// Declare this transaction to be observed. We'll receive the stackId for reference.
const stackId = drizzle.contracts.SimpleStorage.methods.set.cacheSend(2, {from: '0x3f...'})

// Use the stackId to display the transaction status
if (state.transactionStack[stackId]) {
const txHash = state.transactionStack[stackId]

return state.transactions[txHash].status
}
}

// If Drizzle isn't initialized, display some loading indication.
return 'loading'

The contract instance has all of its standard web3 properties and methods.

drizzle.contracts.SimpleStorage.methods.set(2).send({from: '0x3f...'})

Options

{
contracts,
events: {
contractsName: [
eventName
]
},
web3: {
fallback: {
type
url
}
}
}
  • Contracts: Array, required, an array of contract artifact files

  • Events: Object, an object consisting of contract names each containing an array of strings of the event names we’d like to listen for and sync with the store

  • Web3: Object, options regarding web3 instantiation

  • Fallback: Object, an object consisting of the type and url of a fallback web3 provider. This is used if no injected provider, such as MetaMask or Mist, is detected.

    • type: string, the type of web3 fallback, currently ws is the only possibility

    • url: string, the full websocket url. For example, ws://127.0.0.1:8546

How data stays fresh

  1. Once initialized, Drizzle instantiates web3 and our desired contracts, then observes the chain by subscribing to new block headers

  2. Drizzle keeps track of contract calls so it knows what to synchronize

  3. When a new block header comes in, Drizzle checks that the block isn’t pending, then goes through the transactions looking to see if any of them touched our contracts

  4. If they did, we replay the calls already in the store to refresh any potentially altered data. If they didn’t, we continue with the store data.

Constant, View, and Pure in Solidity

Summary

  • pure for functions: Disallows modification or access of state (this is not enforced yet).

  • view for functions: Disallows modification of state (this is not enforced yet).

  • payable for functions: Allows them to receive Ether together with a call.

  • constant for state variables: Disallows assignment (except initialization); does not occupy a storage slot.

  • anonymous for events: Does not store the event signature as a topic (indexable).

  • indexed for event parameters: Stores the parameter as a topic (indexable).

Question

Q: Solidity 0.4.16 introduced the view and constant function modifiers. The documentation says:

constant for functions: Same as view

Does this mean view is just an alias for constant?

Answer: This is discussed here

  1. The keyword view is introduced for functions (it replaces constant). Calling a view function cannot alter the behaviour of future interactions with any contract. This means such functions cannot use SSTORE, cannot send or receive ether, and can only call other view or pure functions.

  2. The keyword pure is introduced for functions; they are view functions with the additional restriction that their value only depends on the function arguments (pure functions). This means they cannot use SSTORE or SLOAD, cannot send or receive ether, cannot use msg or block, and can only call other pure functions.

  3. The keyword constant is invalid on functions

  4. The keyword constant on any variable means it cannot be modified (and could be placed into memory or bytecode by the optimiser)
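The four keywords side by side, as a minimal sketch (contract and names are illustrative):

```solidity
pragma solidity ^0.4.16;

contract Modifiers {
    uint public stored = 1;
    uint constant FEE = 10; // constant state variable: no storage slot

    // view: may read state but not modify it
    function current() public view returns (uint) {
        return stored + FEE;
    }

    // pure: may neither read nor modify state
    function double(uint x) public pure returns (uint) {
        return 2 * x;
    }

    // payable: may receive Ether together with the call
    function deposit() public payable {
        stored += msg.value;
    }
}
```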
