🟣 US_stock_bundle

#########################################################################
# US_stock_bundle
###############

# ~.zipline/extensions.py

#from zipline.data.bundles import register, US_stock_data
#
#register('US_stock_bundle', US_stock_data.US_stock_data, calendar_name='NYSE')

#########################################################################


import pandas as pd
from os import listdir

# Change the path to where you have your data
try:
  import google.colab
  # for COLAB
  path = '/conte

🟣 TA-LIB, ZIPLINE & US_stock_bundle

from google.colab import drive
drive.mount('/content/drive')
###################################################################################################
!pip install /content/drive/MyDrive/WheelFiles/TA_Lib-0.4.26-cp310-cp310-linux_x86_64.whl
!cp /content/drive/MyDrive/WheelFiles/libta_lib.so.0 /usr/lib
!pip install /content/drive/MyDrive/WheelFiles/zipline_reloaded-0.0.0-cp310-cp310-linux_x86_64.whl
########################################################################################

1367. Linked List in Binary Tree

Given a binary tree root and a linked list with head as the first node. Return True if all the elements in the linked list starting from the head correspond to some downward path connected in the binary tree otherwise return False. In this context downward path means a path that starts at some node and goes downwards.
/**
 * Definition for singly-linked list.
 * function ListNode(val, next) {
 *     this.val = (val===undefined ? 0 : val)
 *     this.next = (next===undefined ? null : next)
 * }
 */
/**
 * Definition for a binary tree node.
 * function TreeNode(val, left, right) {
 *     this.val = (val===undefined ? 0 : val)
 *     this.left = (left===undefined ? null : left)
 *     this.right = (right===undefined ? null : right)
 * }
 */
/**
 * @param {ListNode} head
 * @param {TreeNode} root
 * @return {boolean}
 */
var isSubPath = function(head, root) {
    if (!root) return false;
    // Match the list starting at this node, or recurse into either subtree
    return matchFrom(head, root) || isSubPath(head, root.left) || isSubPath(head, root.right);
};

// True if the list from `node` matches a downward path starting at `tree`
function matchFrom(node, tree) {
    if (!node) return true;                       // entire list matched
    if (!tree || tree.val !== node.val) return false;
    return matchFrom(node.next, tree.left) || matchFrom(node.next, tree.right);
}

social share movies

Social Movies
```php
<div class="social-bar">
    <a href="https://www.facebook.com/sharer/sharer.php?u=<?php the_permalink(); ?>" target="_blank" class="social-facebook">
        <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 320 512"><path d="M279.14 288l14.22-92.66h-88.91v-60.13c0-25.35 12.42-50.06 52.24-50.06h40.42V6.26S260.43 0 225.36 0c-73.22 0-121.08 44.38-121.08 124.72v70.62H22.89V288h81.39v224h100.17V288z"/></svg>
    </a>
    <a target="_blank" href="https://twitter.com/intent/tweet?url=<?php the_permalink(); ?>" class="social-twitter">
        <!-- Twitter/X icon SVG -->
    </a>
</div>
```

LLM - PyTorch - .squeeze()

The `.squeeze()` function in PyTorch is used to **remove any dimensions of size 1** from a tensor. In the context of your code:

```python
tokens['labels'] = labels['input_ids'].squeeze()  # Remove extra dimension
```

### **Why Use `.squeeze()`?**

1. **Tensor Shapes After Tokenization**:
   - When you use `return_tensors="pt"` in Hugging Face’s tokenizer, it returns tensors with a batch dimension, even if you’re processing a single example.
   - For instance, after tokenization, `labels['input_ids']` may come back with shape `[1, seq_len]`; calling `.squeeze()` removes that size-1 batch dimension, leaving a `[seq_len]` tensor.
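A minimal sketch of this behavior (the token IDs below are made-up stand-ins):

```python
import torch

# A tensor with a singleton batch dimension, as a tokenizer with
# return_tensors="pt" would produce for a single example
input_ids = torch.tensor([[101, 2023, 2003, 102]])   # shape: [1, 4]

squeezed = input_ids.squeeze()                       # shape: [4]
print(input_ids.shape, squeezed.shape)
```

Note that `.squeeze()` with no arguments removes *all* size-1 dimensions; `squeeze(0)` is safer when other dimensions could legitimately be 1.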

LLM - Mixed Precision Training VS Quantization

**Mixed Precision Training** and **Quantization** are related concepts but serve different purposes in model training and inference. Let’s break down both:

### **Mixed Precision Training**:
- **Goal**: The goal of mixed precision training is to **speed up training** and **reduce memory usage** without significantly affecting model performance.
- **How it works**: Mixed precision training involves using both **16-bit floating point (FP16)** and **32-bit floating point (FP32)** operations during training: most forward and backward computation runs in FP16, while master weights and loss scaling stay in FP32 for numerical stability.
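A minimal runnable sketch of the idea using `torch.autocast` (bfloat16 on CPU here so it runs anywhere; on GPU you would typically use FP16 together with a `GradScaler`):

```python
import torch
import torch.nn as nn

# Weights stay in FP32; ops inside the autocast region run in lower precision
model = nn.Linear(4, 2)
x = torch.randn(8, 4)

with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    out = model(x)

print(model.weight.dtype)  # parameters remain FP32
print(out.dtype)           # activations computed in bfloat16
```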

LLM - FINE-TUNING - OOM Error - 1ST Measures

The error you're encountering (`OutOfMemoryError`) means that your **GPU's memory** is running out when trying to load the model, data, or during the initialization of the `Trainer` class. In your case, it tried to allocate an additional 788 MB of memory, but your GPU has only 92.88 MB left.

Here’s an educational breakdown of what’s happening and steps to fix it:

### **Why This Happens:**
1. **GPU Memory Usage**: When you load large models like GPT-J (which has 6 billion parameters), it can require roughly 24 GB just for the FP32 weights (6 billion parameters × 4 bytes each), before counting activations, gradients, and optimizer state.
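Some back-of-the-envelope arithmetic makes the numbers concrete:

```python
# Rough memory needed just to hold GPT-J's weights (6B parameters),
# before activations, gradients, or optimizer state are counted
n_params = 6_000_000_000

fp32_bytes = n_params * 4   # 4 bytes per FP32 parameter
fp16_bytes = n_params * 2   # 2 bytes per FP16 parameter

print(f"FP32 weights: {fp32_bytes / 1e9:.0f} GB")  # 24 GB
print(f"FP16 weights: {fp16_bytes / 1e9:.0f} GB")  # 12 GB
```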

LLM - FINE-TUNING - OOM Error - 2ND Measures

If the **Out of Memory (OOM)** error persists even after applying mixed precision training and reducing the batch size, there are a few more techniques you can try to manage GPU memory more effectively. Let's go through them one by one.

### **1. Clear the GPU Memory Before Initializing the Trainer**:
Sometimes, previous operations (like loading the model) can leave residual data in the GPU memory, causing an overflow when you try to initialize the `Trainer`. You can clear the GPU memory manually with Python's garbage collector and `torch.cuda.empty_cache()`.
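A minimal sketch of the cleanup step (assumes PyTorch; `model` below stands in for whatever large objects you want to release):

```python
import gc
import torch

# First del any large objects you no longer need, e.g.:
# del model, trainer

gc.collect()                      # reclaim unreferenced Python objects
if torch.cuda.is_available():
    torch.cuda.empty_cache()      # return cached GPU memory to the driver
```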

3217. Delete Nodes From Linked List Present in Array

You are given an array of integers nums and the head of a linked list. Return the head of the modified linked list after removing all nodes from the linked list that have a value that exists in nums.
/**
 * Definition for singly-linked list.
 * function ListNode(val, next) {
 *     this.val = (val===undefined ? 0 : val)
 *     this.next = (next===undefined ? null : next)
 * }
 */
/**
 * @param {number[]} nums
 * @param {ListNode} head
 * @return {ListNode}
 */
var modifiedList = function(nums, head) {
    // Create a set from the nums array for O(1) look-up time
    const numSet = new Set(nums);

    // Create a dummy node to handle edge cases where the head needs to be removed
    let dummy = new ListNode(0, head);
    let current = dummy;

    // Unlink every node whose value appears in the set
    while (current.next) {
        if (numSet.has(current.next.val)) {
            current.next = current.next.next;
        } else {
            current = current.next;
        }
    }

    return dummy.next;
};

wifi hotspot

iw list | grep -A 10 "Supported interface modes"
sudo nmcli dev wifi hotspot ifname wlp1s0 ssid "ACE 2.4G 1st Floor" password "server***1###alt"
sudo nmcli connection show
sudo nmcli connection down Hotspot

Playwright Tracing

val pageContext = buildUesBrowserContext(...)

//Creating tracing
pageContext.tracing().start(Tracing.StartOptions().setScreenshots(true).setSnapshots(true))
val page = pageContext.newPage()

try {
  ...
} finally {
    pageContext.tracing().stop(Tracing.StopOptions().setPath(Path("C:\\My\\Path\\log\\test.zip")))
}

// Then load the zip file to https://trace.playwright.dev/

log formData in console

let postData = new FormData(form);
// FormData.keys() returns an iterator; Iterator.prototype.forEach is not
// supported everywhere, so iterate the entries directly instead
for (const [key, value] of postData.entries()) {
    console.log(key, value);
}

TRANSFORMERS - Queries, Keys and Values - Analogy

Let’s break down the concepts of **Queries**, **Keys**, and **Values** in a way 
that’s easy to follow,
- first using an **analogy**
- then showing how they work in a **practical example**.


### **Analogy: Finding a Book in a Library**

Imagine you're in a library, and you're looking for a book about "dogs." Here's how you would do it:

1. **Query**: This is your search request. 
In this case, it’s “*I want books about dogs*.” That’s your **query**: **what you’re looking for**.

2. **Keys**: These are like the catalog entries on each book's spine. The library compares your query against every key to decide which books are relevant: **what each item is about**.

3. **Values**: These are the books themselves. Once the best-matching keys are found, the corresponding values are what you actually take away: **the content that gets returned**.
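The library analogy maps directly onto scaled dot-product attention. A toy NumPy sketch (random matrices stand in for the learned Q/K/V projections):

```python
import numpy as np

rng = np.random.default_rng(0)
d_k = 4
Q = rng.standard_normal((3, d_k))   # queries: what each token is looking for
K = rng.standard_normal((3, d_k))   # keys: what each token offers
V = rng.standard_normal((3, d_k))   # values: the content actually returned

scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarity
weights = np.exp(scores)
weights /= weights.sum(axis=1, keepdims=True)   # softmax: each row sums to 1
output = weights @ V                            # weighted mix of values

print(output.shape)  # (3, 4)
```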

TRANSFORMERS - ATTENTION - Global Analogy

To explain attention mechanisms in Transformers simply, think of it like this:

Imagine you're reading a book. When you read each word, your brain doesn’t just 
focus on that word alone — it relates it to previous words and keeps the context 
in mind. **Some words or sentences are more important than others in understanding the meaning 
of a passage, so your brain pays more “attention” to those important parts**.

In Transformers, attention mechanisms do something similar. Instead of reading the input strictly word by word, the model scores how relevant every other token is to the one being processed and weights its representation accordingly, so the important tokens get more "attention".

Python FastAPI w/ Express Webhooks Setup

### Setup
* Set up the ngrok config file to use 2 servers
* `ngrok start --all` to run them both (plain `ngrok start` requires tunnel names)

---

### Webhooks (Express App)
* Ensure `.env` file has up to date credentials
* If referring to the dynacomAPI, make sure that url is set (will be dynamic,
so ensure it is up to date)

### API (Python)
* `local.settings.json`
```
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsFeatureFlags": "EnableWorkerIndexing",
```
* ensure settings

Ngrok Configurations

### NGROK Set up 2 servers
* run `ngrok config check` to check the location of the config file
* edit it with the following:
```
version: "2"
authtoken: 2YJjpBQ8t2xvdDsUIp88JVV18aY_2BdjVXeg7vxiDoMM1xurX

tunnels:
  first:
    addr: 3000
    hostname: pangolin-enough-mammoth.ngrok-free.app
    proto: http
  second:
    addr: 7071
    proto: http
```
* first 2 lines are default
* use `first` and `second` to add servers
  * the first one can use a hostname if account created (or paid)
  * the second one gets a random ngrok URL each time it starts (no fixed hostname without a paid plan)