json rpc – How to use createrawtransaction for sending to bare multisig?

There are lots of tools for performing pay-to-script-hash transactions, but how do you build the createrawtransaction parameters for sending to a bare multisig?

What I mean is: what are the exact steps to fetch, then select and insert, a UTXO?
Then, how do I compute the txid field?
What's the syntax of the output script for a 2-of-1 transaction? And how do I convert that text to bytecode?

A step-by-step example detailing those parts would be welcome.
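For the output-script part: a bare multisig scriptPubKey has the form `OP_m <pubkey...> OP_n OP_CHECKMULTISIG` (note m must be ≤ n, so a 1-of-2 is sketched here rather than the "2 of 1" in the question). A minimal sketch of assembling those bytes in Python; the two compressed public keys are placeholders, not real keys:

```python
# Sketch: assemble a bare 1-of-2 multisig scriptPubKey by hand.
OP_1 = 0x51               # OP_1 .. OP_16 are opcodes 0x51 .. 0x60
OP_2 = 0x52
OP_CHECKMULTISIG = 0xae

pubkey1 = bytes.fromhex("02" + "11" * 32)  # placeholder 33-byte compressed key
pubkey2 = bytes.fromhex("03" + "22" * 32)  # placeholder 33-byte compressed key

script = bytes([OP_1])
for pk in (pubkey1, pubkey2):
    # For data up to 75 bytes, the push opcode is simply the data length (33 = 0x21)
    script += bytes([len(pk)]) + pk
script += bytes([OP_2, OP_CHECKMULTISIG])

print(script.hex())
```

Note that Bitcoin Core's `createrawtransaction` outputs argument only accepts `address:amount` pairs and a `data` field, not arbitrary scripts, so a script like this generally has to be spliced into the serialized transaction afterwards (or built with a library that supports raw scripts) rather than passed to the RPC directly.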

Reading a JSON file (Rust)

I’m trying to read a config file in JSON. As usual, after a bout of compiler errors, this code reads the config file successfully. I can’t shake the feeling, however, that this is extraordinarily bad code:

    let mut config_file = String::new();

    match fs::read_to_string("config.json") {
        Ok(v) => {
            config_file = v;
            println!("Successfully read config");
        }
        Err(_) => println!("Encountered error while reading config..."),
    }

    let config = json::parse(&config_file).unwrap();
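One way to tighten this (a sketch; the right error-handling strategy depends on the program) is to drop the mutable placeholder `String` and let `read_to_string`'s `Result` flow to the caller with `?`:

```rust
use std::fs;
use std::io;

// Sketch: return the io::Result instead of pre-declaring an empty String;
// the caller decides what an unreadable config means.
fn read_config(path: &str) -> io::Result<String> {
    let contents = fs::read_to_string(path)?;
    println!("Successfully read config");
    Ok(contents)
}
```

The match-then-`unwrap` pair in the original silently parses an empty string when the read fails; returning the `Result` makes that failure visible at the call site before anything is handed to `json::parse`.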

index – Does PostgreSQL support wildcard indexes on JSON?

I have a multi-tenant service with a table like this:

project_id | user_id | user_properties

Each project belongs to a different customer and customers can freely attach metadata to their users. A project may have millions of users.

Then a customer may want to find some users inside their project by filtering on the user_properties (e.g. age greater than X, favorite music equal to Y, etc.)

user_properties can be an arbitrary json of key-value pairs and a customer can run arbitrary queries on the user_properties. The json is not nested (only key-value pairs).

Since a query may return many results it would also be useful to use some sort of pagination (e.g. order by user_id + limit). But pagination, together with arbitrary filters, seems an additional issue for performance…

Is it possible to handle that case in PostgreSQL? Is EAV the only solution?

I see that MongoDB supports wildcard indexes: does PostgreSQL offer anything similar?
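PostgreSQL's closest equivalent (assuming `user_properties` is a `jsonb` column on a `users` table matching the layout above) is a GIN index over the whole document, which indexes every key/value pair without declaring the keys up front:

```sql
-- GIN over the whole jsonb document; the jsonb_path_ops operator class is
-- smaller and faster but only supports the containment operator @>.
CREATE INDEX users_props_gin ON users USING GIN (user_properties jsonb_path_ops);

-- Equality filters then become containment queries (project 42 is hypothetical):
SELECT user_id
FROM users
WHERE project_id = 42
  AND user_properties @> '{"favorite_music": "jazz"}'
ORDER BY user_id
LIMIT 100;
```

This covers equality predicates; range filters such as "age greater than X" are not served by GIN containment and typically need per-key expression B-tree indexes, e.g. `CREATE INDEX users_age ON users (((user_properties->>'age')::int));` for the handful of keys that need ranges.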

Turning existing PHP SQL queries into a JSON API service in Node (or other)

I was handed a prototype of a website written in PHP, with SQL queries injected in each respective PHP file. As an example: the “profile.php” that displays the user’s profile would run certain queries, save them as variables, then use them in HTML (in the same file).

if ($id > 0) {

    $query = "SELECT users.*, (select usertypes.name from usertypes where usertypes.id=users.type) as usertypesname FROM users WHERE id=" .$id;

    $stmt = $db->query($query);

    while($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $typeid = $row['type'];
        $typename = $row['usertypesname'];
        $email = $row['email'];
        $firstname = $row['firstname'];
        $lastname = $row['lastname'];
        $company = $row['company'];
        $vat = $row['vat'];
        $city = $row['city'];
        $address = $row['address'];
        $zip = $row['zip'];
        $phone = $row['phone'];
        $description = $row['description'];
        $deliveries = $row['deliveries'];
        $tags = $row['tags'];
        $image = $row['image'];

    }
}

And then those respective variables are just inserted in the HTML body.

The goal is to build a React Native application out of this website and from my understanding you should definitely use an API with JSON objects for requests/responses. My idea was to re-use these same SQL queries in Node and make page controllers that return JSON strings, essentially making my own API.

My question is, is there any better way to do this? Implementation hasn’t started yet, so I’m very thankful for any ideas.

I’m trying to receive a JSON with several inserted records:

@PostMapping("/crearPersonas")
public ResponseEntity<?> crearPersonas(@RequestBody CrearPersonaDto personaDto){
    
    ResponseDto response = new ResponseDto();
    response.setCodigo(Codigo.OK);
    response.setDescripcion("Personas insertadas correctamente");
    
    try {
        response.setData(personaService.insertarPersonas(personaDto).getData());
    } catch (Exception e) {
        response.setCodigo(Codigo.NO_OK);
        response.setDescripcion(e.getMessage());
        return new ResponseEntity<>(response, HttpStatus.INTERNAL_SERVER_ERROR);
    }
    
    return new ResponseEntity<>(response, HttpStatus.OK);
}

json – Adding rows to ExcelJS with React

Greetings, community.
I’m fairly new to React and Node.js. I’m building an app to export a JSON file to Excel using ExcelJS. The JSON is nested and I retrieve it like this:

const objectsFromCols = () => {
    const valuesToExport = [];
    rutas.forEach((ruta) => {
        let rutaExport = {
            Ruta: ruta.label,
            Derrotero: ruta.value.descripcion,
            Inicio: ruta.value.inicio,
            Fin: ruta.value.fin,
            Ciudad: ciudad.nombre
        }
        valuesToExport.push(rutaExport);
    });
    return valuesToExport;
}

And I use the following to fill the spreadsheet, taken from the repository
getting-started-with-exceljs
:

    rutas.forEach((e, index) => {
        // row 1 is the header.
        const rowIndex = index + 2

        worksheet.addRow({
            ...e
        })
    });

However, when I save the Excel file, only the headers are written. Could someone help me with this? Regards and thanks in advance.
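A common cause of "headers only" with ExcelJS: `worksheet.addRow({...})` matches the object's properties against the `key` of each entry in `worksheet.columns`, so if the keys don't match (or the loop iterates the raw nested `rutas` objects instead of the flattened ones from `objectsFromCols`), the cells stay empty. A sketch with hypothetical sample data:

```javascript
// Flatten the nested objects first (same shape as objectsFromCols returns),
// then make sure worksheet.columns keys match those property names.
const ciudad = { nombre: 'Quito' }; // hypothetical data
const rutas = [
  { label: 'R1', value: { descripcion: 'd1', inicio: '08:00', fin: '10:00' } },
];

const valuesToExport = rutas.map((ruta) => ({
  Ruta: ruta.label,
  Derrotero: ruta.value.descripcion,
  Inicio: ruta.value.inicio,
  Fin: ruta.value.fin,
  Ciudad: ciudad.nombre,
}));

// With ExcelJS this would then be:
// worksheet.columns = [
//   { header: 'Ruta', key: 'Ruta' }, { header: 'Derrotero', key: 'Derrotero' },
//   { header: 'Inicio', key: 'Inicio' }, { header: 'Fin', key: 'Fin' },
//   { header: 'Ciudad', key: 'Ciudad' },
// ];
// valuesToExport.forEach((row) => worksheet.addRow(row));
```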

Extracting JSON values from an R dataframe

After a lot of research I cannot find a solution for extracting JSON values from a dataframe in R without knowing the keys.

I have a dataframe named test which has one column, common (containing only JSON values), and one column, id. There are millions of records.

Example

id  common
1   {ename=pageload, pgloc={from=https://m.amazon.com/gift/popular-aloe-vera-plant?gclid=CjwKCAjw8MD7BRArEiwAGZsrBZh6cWJ1-PGvFC1zMutwfjBJuGROHhW4l_ZtcH3n2ZvPSotsTO-sgxoCucAQAvD_BwE, to=https://m.amazon.com/gift/popular-aloe-vera-plant?gclid=CjwKCAjw8MD7BRArEiwAGZsrBZh6cWJ1-PGvFC1zMutwfjBJuGROHhW4l_ZtcH3n2ZvPSotsTO-sgxoCucAQAvD_BwE#/product-page, clikd=}, dev={ver=1.0, blang=en-GB, ip=27.5.192.167, dtype=Mobile, ua=Mozilla/5.0 (Linux; Android 10; HD1901) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/85.0.4183.127 Mobile Safari/537.36, did=bc0a2740-d89e-11ea-b35c-567d21f9cbe3, appos=Android, appversion=null, model=null, osver=null, brand=null, pfspec=web}, fngid=81f8036d1099bd59ff93454d1f8, dname=fnp.com, user={cur=, id=sufeb@gmail.com}, wegid=null}  

2 {ename=pageload, pgloc={from=https://www.google.com/, to=https://m.amazon.com/gift/red-velvet-fresh-cream-cake?gclid=CjwKCAjw8MD7BRArEiwAGZsrBd464AGGzOLMzzaxggCPNU-onDOZuhUqzz3tB6UOIUneNq6rcduxUxoCjXwQAvD_BwE#/product-page, clikd=}, dev={ver=1.0, blang=en-US, ip=106.217.118.179, dtype=Mobile, ua=Mozilla/5.0 (Linux; Android 8.1.0; vivo 1724) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3865.92 Mobile Safari/537.36, did=d43e7274-8116-11ea-96ae-b627f142e667, appos=Android, appversion=null, model=null, osver=null, brand=null, pfspec=web}, fngid=aeb8d109630f797980ac4cc4066d4c4b, dname=fnp.com, user={cur=, id=}, wegid=null}  

I want to extract all of the values into separate columns in the same data frame. For example, for id 1 user={cur=, id=sufeb@gmail.com} has values, whereas for id 2 user={cur=, id=} has none. Hence I need all values irrespective of keys.

I tried this; it does not work:

library(tidyverse)
library(rjson)

extract_json_column <- function(dfelbs){
  dfelbs %>%
    rowwise() %>%
    mutate(data = map(common, fromJSON)) %>%
    split(.$index) %>%
    map(~.$common[[1]]) %>%
    map(~map_if(., function(x) length(x) != 1, list)) %>%
    map(as_data_frame) %>%
    bind_rows(.id = "index")
}

df <- do.call(plyr::rbind.fill, lapply(dfelbs[dfelbs != ""], function(x) as.data.frame(t(unlist(fromJSON(x))))))

Thanks

python – A single JSON object between [ ]

I have a question about a script I wrote using Scrapy to collect data and send it to a JSON file. The problem is that the file's formatting is not the way it usually is, which I found strange, so I'm unsure whether there is a problem or not. If anyone can answer me, thanks. The code and the file contents are below:

[
{"uf": "AL", "area": "C\u00edvel", "juiz": "Henrique Gomes de Barros Teixeiran", "partes": [{"nome": "Maria Edite dos Santos", "tipo": "Autora", "Advogado(s)": [{"nome": "Defensoria P\u00fablica do Estado de Alagoas", "tipo": "Defensor P"}]}, {"nome": "Hipercard Banco Multiplo S/A", "tipo": "R\u00e9u", "Advogado(s)": [{"nome": "Raoni Souza Drummond", "tipo": "Advogado"}, {"nome": "Eduardo Fraga", "tipo": "Advogado"}, {"nome": "Andrea Freire Tynan", "tipo": "Advogado"}]}, {"nome": "W. dos S. F.", "tipo": "Testemunha"}, {"nome": "P. V. R. de L.", "tipo": "Testemunha"}]}
]

CODE:

import scrapy

class TjalSpdrSpider(scrapy.Spider):

    name = 'tjal'
    allowed_domains = ['www2.tjal.jus.br/cpopg/']
    # url_path = www2.tjal.jus.br/cpopg/open.do
    start_urls = [
        'https://www2.tjal.jus.br/cpopg/show.do?processo.codigo=01000I1FT0000&processo.foro=1&processo.'
        'numero=0731425-82.2014.8.02.0001&uuidCaptcha=sajcaptcha_2976d855423340b4be91a23ff5add85d'
    ]

    def parse(self, response):

        table_partes = response.xpath('//table[@id="tableTodasPartes"]/tr[@class="fundoClaro"]')

        area = ''.join(response.xpath('//table[@class="secaoFormBody"]/tr[4]/td[2]/table/tr/td/text()').getall())
        juiz = response.xpath('//table[@class="secaoFormBody"]/tr[10]/td/span/text()').get()
        partes = []

        for dados in table_partes:
            tipo = dados.xpath('./td/span/text()').get().strip()[:-1]
            tipo_adv = dados.xpath('./td[2]/span[@class="mensagemExibindo"]/text()').get()
            nome = dados.xpath('./td[2]/text()').get().strip()
            advg = [{'nome': f'{adv}'.strip(), 'tipo': f'{tipo_adv}'.strip()[:-1]}
                    for adv in dados.xpath('./td[2]/text()[preceding-sibling::span]').getall() if adv.strip() != '']
            if nome != '':
                if tipo != 'Testemunha':
                    partes.append({
                        'nome': nome,
                        'tipo': tipo,
                        'Advogado(s)': advg
                    })
                else:
                    partes.append({
                        'nome': nome,
                        'tipo': tipo,
                    })

        yield {
            'uf': 'AL',
            'area': area.strip(),
            'juiz': juiz,
            'partes': partes
        }

mysql – PHP Updating JSON Array in SQL Database Overwrites entire array

What I’m attempting to do is read a JSON string from a column in a database table, decode it, modify or add indexes and values, encode it, and save it back to the table. However, when I do this with a JSON-encoded table from Lua (submitting it, creating the index for it, and rewriting it), the entire array gets cleared and replaced with only the most recent modification. It could be a length issue, or some kind of syntax issue in the encoding/decoding process; I am not sure.

Code:

Write Function (php)

function write($Table,$Column,$Index,$Value){
  $conn = mysqli_connect("<insert credentials here>");

  $Data = $conn->query("SELECT ".$Column." FROM ".$Table)->fetch_assoc()[$Column];
  echo($Data);
  $Decode = json_decode($Data,true);

  if($Value == null){
   unset($Decode[$Index]);
  }else{
   $Decode[$Index] = $Value;
  }

  if(count($Decode) == 0){
   $Encode = "{}";
  }else{
   $Encode = json_encode($Decode);
  }

  $conn->query("UPDATE ".$Table." SET ".$Column."='".$Encode."';");
 }

Where the write function is called (php):
(response is just a function for returning a JSON response when data is posted)

$Training = array();
 $Training['ID'] = $ID;
 $Training['Host'] = $Host;
 $Training['PlaceName'] = $PlaceName;
 $Training['PlaceID'] = $PlaceID;
 $Training['Data'] = $Data;

 write($Database,"Logs",$ID,json_encode($Training));
 response('Success','Success');
 return;

api – Is there such a thing as a freely available JSON feed of “Bitcoin events” (news, but without the fluff)?

(I asked this the other day, but it was mysteriously gone today. No trace. 404 Not Found. I assume it must be a technical glitch in the Stack Exchange system, so I’m reposting it.)

What do I mean by “Bitcoin happenings”? Well, I did not want to say “news”, because I’m not looking for an RSS feed (or even JSON feed) for some news site about Bitcoin. I also don’t mean something which tells me how the price is going up and down in real time.

I’m talking about something which I could, in theory, base my trading bot on once Bisq finally releases their API.

That is, to answer the questions “when should I buy Bitcoin?” and “when should I sell Bitcoin?”.

As far as I understand it, they had automated such systems way back in the 1980s, if not even earlier, but obviously these were not for the “everyday man”, but for rich people who had tons of money already.

But still, it’s been 40 years now, so maybe, just maybe this is now available to all?

Basically, if something happens with Bitcoin, I want to be notified ASAP by fetching that JSON blob and then basing my actions on the keywords in there. Maybe it contains “CNN” and “Bitcoin” and “plummet”, and then I will want to sell some, fearing that the market is going to start selling tons of them, thus making the value go down.

I would even find this very useful in a non-automated context, simply for learning about what happens with Bitcoin. I find all the Bitcoin news sites to be full of “noise” which doesn’t interest me and doesn’t seem relevant or interesting whatsoever.

Update: I do know about the “Fear and Greed Index”, and have implemented it into my system. However, its value is highly dubious at best, and it’s not quite what I was looking for.