A promise represents the eventual result of an asynchronous operation. The idea is to run dependent code after that operation has completed, without blocking execution of the rest of the program. This can also be accomplished with callbacks, but promises make the code easier to read and offer a few additional benefits, such as chaining and unified error handling.
Learn more about promises from MDN.
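As a minimal sketch, here is a hypothetical `fetchGreeting` function that wraps a timer in a promise; callers chain follow-up logic with `.then()` and handle failures with a single `.catch()` instead of nesting callbacks:

```javascript
// Returns a Promise that resolves after a short delay.
// (Illustrative only -- the delay stands in for real async work
// such as a network request or file read.)
function fetchGreeting(name) {
  return new Promise((resolve, reject) => {
    if (typeof name !== "string") {
      reject(new Error("name must be a string"));
      return;
    }
    setTimeout(() => resolve(`Hello, ${name}!`), 10);
  });
}

// The chain reads top to bottom; any rejection funnels into .catch().
fetchGreeting("Ada")
  .then((msg) => console.log(msg)) // logs "Hello, Ada!"
  .catch((err) => console.error(err.message));
```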
Relational databases contain data organized in tables made of rows and columns, with each table column containing a specific data type. Transactions support the ACID (Atomic, Consistent, Isolated, Durable) properties, ensuring that all transactions are processed reliably.
Atomic means that if any part of the transaction fails, the entire transaction fails. All data is brought back to its state prior to the failed transaction.
Consistent means that everything written to the database must be valid according to all defined rules including constraints, cascades, and triggers.
Isolated means that the effects of a transaction are not visible to other transactions until the transaction is complete.
Durable means that all committed transactions are stored permanently and will persist even if there is a loss of power or the database crashes.
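The Atomic property can be sketched in plain JavaScript with a toy in-memory "bank": every step of a transfer succeeds, or the balances are left exactly as they were. This is only an illustration; a real database enforces atomicity with BEGIN/COMMIT/ROLLBACK, and the `accounts` object and `transfer` function here are hypothetical.

```javascript
const accounts = { alice: 100, bob: 50 };

function transfer(from, to, amount) {
  // Work on a snapshot so a failure cannot leave partial changes behind.
  const snapshot = { ...accounts };
  try {
    snapshot[from] -= amount;
    snapshot[to] += amount;
    if (snapshot[from] < 0) throw new Error("insufficient funds");
    Object.assign(accounts, snapshot); // "commit": apply all changes at once
    return true;
  } catch (err) {
    return false; // "rollback": accounts is untouched
  }
}

transfer("alice", "bob", 30);  // succeeds: alice 70, bob 80
transfer("alice", "bob", 500); // fails: balances stay 70 / 80
```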
An algorithm’s time complexity is often expressed using big O notation. Below are some common types of algorithmic time complexity.
O(1) — With constant time complexity, as the input size grows, the number of steps remains the same. This is the most desirable type of time complexity. Looking up an item at a specific index of an array is an example of constant time. No matter how large the array, the lookup is a single operation.
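A small sketch of the array lookup described above:

```javascript
const items = ["a", "b", "c", "d"];

// Index lookup is a single operation regardless of array length.
function lookup(arr, i) {
  return arr[i]; // O(1): one step whether arr has 4 or 4 million items
}

lookup(items, 2); // "c"
```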
O(log n) — With logarithmic time complexity, the number of steps increases as the input size increases, but at a slower rate as the size grows larger and larger. Binary search is an example of logarithmic time because we can divide the problem in half with each iteration. As we double the input size, only one additional step is required.
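The binary search mentioned above can be sketched like this; each comparison halves the remaining range, so a sorted array of n items needs about log₂(n) steps:

```javascript
// Binary search over a sorted array of numbers.
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid; // found: return its index
    if (sorted[mid] < target) lo = mid + 1; // discard the lower half
    else hi = mid - 1;                      // discard the upper half
  }
  return -1; // not found
}

binarySearch([1, 3, 5, 7, 9, 11], 7); // 3
```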
O(n) — With linear time complexity, the size of the input and the number of steps grow proportionally. Iterating through an array is an example of linear time because each additional item in the array requires one additional iteration.
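Iterating through an array, as described above, might look like this:

```javascript
// One pass through the array: each extra element adds exactly one step.
function sum(arr) {
  let total = 0;
  for (const n of arr) {
    total += n; // runs arr.length times -> O(n)
  }
  return total;
}

sum([2, 4, 6]); // 12
```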
O(n^2) — With quadratic time complexity, the number of steps is proportional to the square of the input size. An example of quadratic time is when we have a loop inside of a loop. If we iterate through an array of items, and on each iteration we iterate through all items in the array again, that would be quadratic time complexity. A loop inside a loop inside a loop would be O(n^3).
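The loop-inside-a-loop pattern described above can be sketched with a hypothetical duplicate check: for each of the n items we scan all n items again, so the total work is n × n:

```javascript
// Quadratic duplicate check: compares every element with every other.
function hasDuplicate(arr) {
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < arr.length; j++) {
      if (i !== j && arr[i] === arr[j]) return true;
    }
  }
  return false; // n * n comparisons in the worst case -> O(n^2)
}

hasDuplicate([1, 2, 3, 2]); // true
hasDuplicate([1, 2, 3]);    // false
```

(A Set-based version would be O(n); the nested loops are kept here purely to illustrate quadratic growth.)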
O(C^n) — With exponential time complexity, the number of steps increases exponentially with each increase of input size. This might be fine for very small input sizes, but as input size grows larger and larger, the time complexity may quickly become unacceptable. An example of exponential time is a naive recursive solution to find the nth Fibonacci number.
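The naive recursive Fibonacci mentioned above: each call spawns two more calls, so the call tree roughly doubles with every increase in n:

```javascript
// Naive recursive Fibonacci -- exponential, roughly O(2^n).
// Memoization or an iterative loop would reduce this to O(n).
function fib(n) {
  if (n < 2) return n; // base cases: fib(0) = 0, fib(1) = 1
  return fib(n - 1) + fib(n - 2); // two recursive calls per step
}

fib(10); // 55
```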