java – How to extract a record from the database with Spring Batch, based on an identifier

package com.csvpostgresspringbatch.config;

import com.csvpostgresspringbatch.dao.BankDAO;
import com.csvpostgresspringbatch.model.Bank;
import com.csvpostgresspringbatch.step.Listener;
import com.csvpostgresspringbatch.step.Processor;
import com.csvpostgresspringbatch.step.Reader;
import com.csvpostgresspringbatch.step.Writer;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.support.RunIdIncrementer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class BatchConfig {

	@Autowired
	public JobBuilderFactory jobBuilderFactory;

	@Autowired
	public StepBuilderFactory stepBuilderFactory;

	@Autowired
	public BankDAO bankDao;




	@Bean
	public Job job() {
		return jobBuilderFactory.get("job").incrementer(new RunIdIncrementer()).listener(new Listener(bankDao))
				.flow(step1()).end().build();
	}

	@Bean
	public Step step1() {
		return stepBuilderFactory.get("step1").<Bank, Bank>chunk(1)
				.reader(Reader.reader("bank-data.csv"))
				.processor(new Processor())
				.writer(new Writer(bankDao))
				.build();
	}



}
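If the goal is to read only the record matching a given identifier, one common Spring Batch idiom is to filter in the ItemProcessor: returning null from process() drops the item from the chunk. Below is a minimal plain-Java sketch of that filtering logic; the Bank record and process method are illustrative stand-ins, not the question's actual Processor class.

```java
import java.util.*;
import java.util.stream.*;

public class IdFilterSketch {
    // Illustrative stand-in for the domain class.
    record Bank(long id, String name) {}

    // Spring Batch convention: a processor that returns null filters
    // the item out of the chunk; only matching items reach the writer.
    static Bank process(Bank item, long wantedId) {
        return item.id() == wantedId ? item : null;
    }

    public static void main(String[] args) {
        List<Bank> input = List.of(new Bank(1, "Alpha"), new Bank(2, "Beta"));
        List<Bank> kept = input.stream()
                .map(b -> process(b, 2L))
                .filter(Objects::nonNull)
                .collect(Collectors.toList());
        System.out.println(kept.size()); // prints 1
    }
}
```

With this idiom the Reader can keep streaming the whole CSV while only the matching record reaches the Writer; alternatively, a JdbcCursorItemReader with a WHERE clause on the identifier pushes the filtering into the database itself.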

database design – Are several dbs bad?

I am a junior programmer with no formal database training; I have only encountered MongoDB during my work so far, and that's it. I am trying to understand the benefits of microservices design, and one of the claims I heard was: "even if you want to scale part of the application, you should keep a single write service for the database, because you never want to have more than one database (or duplicates?)."

My question is: is it really wrong to have more than one database holding the same tables / schema? It seems like it would be problematic at scale.

Thanks, Or

Ethics – When cleaning a hacked WordPress site, I had access to the hacker's database. What action should I take?

I just finished cleaning a compromised WordPress site; it was hacked through a vulnerable plugin and several fake sitemaps and redirections configured to pollute Google's results with pharmaceutical ads, etc.

After dealing with this, I noticed that the hacker had also compromised files in an old copy of the site that was still in the public_html directory (or that they had created that copy themselves).

Checking the wp-config file there, I found credentials for a database containing several thousand tables, all of them WordPress tables with different prefixes. I looked at some of the "options" tables to find the site address and visited that URL, confirming my suspicion that these were copies of data from other hacked sites.

It is important to note that I have not accessed any personal data, only the public URLs of WordPress sites.

I do not know what to do about it. My instinct is to extract a list of the site owners' email addresses from this database, delete all the tables, and then contact the owners of the sites to inform them of the breach. However, I am not sure whether I could be held liable, since I am obviously not authorized to access that data either.

I am in the UK, so the GDPR is relevant.

Should I take action?

database – Job SQL Server 2016 that copies files from a Windows server to a Linux server

I have a trigger that generates XML files and leaves them in a folder. These files will be used by another application. Company policy does not allow the other application to pull them from the folder where I create them, because that folder is on the server where the database is installed. I was therefore asked to create a job that moves these files to a Linux SFTP server; alternatively, the trigger could send them directly to the other server.
I therefore need a script that can do this job.

CREATE PROCEDURE dbo.pa_gea_crear_xml_bmcbridge
(
@p_ntra INTEGER -- transaction number
)
AS
BEGIN

    DECLARE @l_Id INT                   -- error or success code of the operation
    DECLARE @l_Msje VARCHAR(1000)       -- error or success message
    DECLARE @l_Ruta VARCHAR(250)        -- directory path
    DECLARE @l_Trx_typ VARCHAR(32)      -- value of the TRX_TYP column
    DECLARE @l_Sql_Cmd VARCHAR(1000)    -- SQL statement
    DECLARE @l_Sql_Shell VARCHAR(1000)  -- command line for the shell
    DECLARE @l_Nom_Arch VARCHAR(16)     -- file name

    BEGIN TRY

        SET @l_Ruta  = 'T:RemesasLiq_RemeBackup';
        SET @l_Sql_Cmd = '';
        SET @l_Sql_Shell = '';
        SET @l_Trx_typ = '';
        SET @l_Nom_Arch = 'TLOG';

        SELECT @l_Trx_typ = LTRIM(RTRIM(TRX_TYP)) FROM EXP_POSLOG WHERE ID = @p_ntra;

        IF (@l_Trx_typ = 'TenderOutflow')
        BEGIN
            SET @l_Nom_Arch = 'TLOG' + '_' + CAST(@p_ntra AS VARCHAR(10)) + '.XML'
            SET @l_Sql_Cmd = 'SELECT POSLOG FROM EXP_POSLOG WHERE ID = ' + CAST(@p_ntra AS VARCHAR(10));
            SET @l_Sql_Shell = 'EXEC xp_cmdshell ' + CHAR(39) + 'bcp' + ' "' + @l_Sql_Cmd + '"' + ' QUERYOUT ' + '"' + @l_Ruta + @l_Nom_Arch + '"' + ' -T -c -t"t"' + CHAR(39);
            EXECUTE (@l_Sql_Shell)

        END

    END TRY
    BEGIN CATCH

        SET @l_Id = ERROR_NUMBER()
        SET @l_Msje = ERROR_MESSAGE()

        SELECT @l_Id AS 'id', @l_Msje AS 'msje'

    END CATCH

END

GO

CREATE TRIGGER tr_gea_exp_poslog
ON EXP_POSLOG
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @l_IdInsertado INTEGER;
    SELECT @l_IdInsertado = ID FROM inserted;
    EXEC pa_gea_crear_xml_bmcbridge @l_IdInsertado;
END
GO
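One way to implement the job step itself (outside T-SQL) is a small program that the SQL Agent job runs, which shells out to a command-line SFTP client. The sketch below assumes such a client is installed (WinSCP, pscp, or OpenSSH's sftp); the command prefix is a placeholder for whatever tool and flags actually apply in your environment. It pushes each XML file and moves it aside on success so files are not re-sent.

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.List;

public class SftpPushJob {
    // Runs the external transfer command once per XML file in the folder
    // and, on a zero exit code, moves the file into a "sent" subfolder.
    static int pushAll(Path folder, List<String> commandPrefix)
            throws IOException, InterruptedException {
        Path sent = Files.createDirectories(folder.resolve("sent"));
        int pushed = 0;
        try (DirectoryStream<Path> xmls = Files.newDirectoryStream(folder, "*.XML")) {
            for (Path xml : xmls) {
                List<String> cmd = new java.util.ArrayList<>(commandPrefix);
                cmd.add(xml.toString());            // e.g. "winscp.com ... put <file>"
                Process p = new ProcessBuilder(cmd).inheritIO().start();
                if (p.waitFor() == 0) {             // transfer succeeded
                    Files.move(xml, sent.resolve(xml.getFileName()));
                    pushed++;
                }
            }
        }
        return pushed;
    }
}
```

The SQL Agent job would then simply invoke this program (or an equivalent script) on a schedule, which also keeps xp_cmdshell out of the trigger path.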

drupal 7 – Is it normal for a database to contain 3,600 tables?

These are the tables for your fields. Whenever you add a field to a content type, Drupal creates two tables: one to store the field's current values and another to store the revision (historical) data for that field.

Deleting these tables will break your site.

It just means that you have a lot of fields. In Drupal 7 it is usually the watchdog table that contributes most to the size of your database; unless you want to keep historical log data, feel free to clear out the watchdog table.

dbms – How to write a SQL database to a file?

I am writing a password manager, and one of the features I would like to include is the ability to transfer the password database from one computer to another, i.e. to export it. The password database will be stored as a SQL database (it could be any SQL engine such as MySQL, Microsoft SQL Server, PostgreSQL, Oracle, or even SQLite), and I need to be able to move it between computers. I therefore think I need to write the database out to a file. If not, how else could I make the database portable?
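With SQLite, the entire database already lives in a single file, so exporting it amounts to copying that file; server-based engines like MySQL or PostgreSQL instead ship dump tools (mysqldump, pg_dump) that write the database out as portable SQL text. A minimal sketch of the SQLite case, assuming no writer has the file open during the copy:

```java
import java.io.IOException;
import java.nio.file.*;

public class DbExport {
    // Copies a SQLite database file to an export location. Assumes no
    // active writers; for a live database, SQLite's online backup API
    // is the safer route.
    static Path exportDatabase(Path dbFile, Path destination) throws IOException {
        return Files.copy(dbFile, destination, StandardCopyOption.REPLACE_EXISTING);
    }
}
```

For the server engines, importing on the other machine is the mirror image: replay the dump file against a fresh database.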


informix – How to get all the tables in a database and their sizes in GB?

I can get a rough estimate of the table sizes with

SELECT tabname, (rowsize*nrows)/1024/1024/1024 AS tabsize 
FROM informix.systables 
ORDER BY tabsize DESC

Is it possible to get a more accurate measurement of the size of the tables?

database – Optimize the web dashboard with editable elements to improve performance by introducing ElasticSearch

To improve performance, we are optimizing our complex tabular web dashboard, which loads a lot of data (many columns and rows). Many cells in the table are editable so that the data can be modified. Our primary data store is MySQL, and we also replicate the data into ElasticSearch. The optimization consists of reading from ElasticSearch instead of MySQL when loading the page (while continuing to write to both MySQL and ElasticSearch).

There is a critical race condition: a cell is updated (in both MySQL and ElasticSearch) and the page is reloaded immediately afterwards. We want to guarantee consistency so that the page does not load stale data from ElasticSearch when the reload happens before ElasticSearch has been updated by the front-end server (React / Redux).

Are there common architectural patterns / services to solve this problem? Most dashboards I have seen that pull their data from ElasticSearch are read-only, hence the problem.
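One common pattern for this is a read-your-writes guard: version every write in the authoritative store and only trust the replica when its version is current, falling back to the primary otherwise. The sketch below is purely illustrative, with in-memory maps standing in for MySQL and ElasticSearch:

```java
import java.util.HashMap;
import java.util.Map;

public class ReadYourWrites {
    // Each stored value carries a monotonically increasing version.
    record Versioned(String value, long version) {}

    final Map<String, Versioned> primary = new HashMap<>();  // stands in for MySQL
    final Map<String, Versioned> replica = new HashMap<>();  // stands in for ElasticSearch

    void write(String key, String value) {
        long v = primary.containsKey(key) ? primary.get(key).version() + 1 : 1;
        primary.put(key, new Versioned(value, v));
        // The replica update may lag in real life; deliberately omitted here.
    }

    String read(String key) {
        Versioned fromReplica = replica.get(key);
        Versioned fromPrimary = primary.get(key);
        // Fall back to the primary when the replica is missing or stale.
        if (fromReplica == null || fromReplica.version() < fromPrimary.version()) {
            return fromPrimary.value();
        }
        return fromReplica.value();
    }
}
```

On the ElasticSearch side itself, indexing with the refresh=wait_for option makes the write call block until the change is searchable, which attacks the same race from the other direction (at some cost in write latency).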

java – How to save multiple images with a one-to-many relationship in a SQLite database

I am developing a simple form in which the user can attach multiple images, saved locally with SQLite. I can already save the other fields, and saving worked when attaching a single image. However, I need to create a one-to-many relationship between paciente and foto and save all of the form's photos. Since I am only now learning SQLite and do not have much Android experience, I have not been able to save multiple images from a single form or to model the one-to-many relationship in the SQLite schema.

Code that inserts the data into the database:

public class PacienteDAO {

private Conexao conexao;
private SQLiteDatabase banco;
private byte[] imageInBytes;

public PacienteDAO(Context context) {
    conexao = new Conexao(context);
    banco = conexao.getWritableDatabase();
}


public long inserirPaciente(Paciente paciente){

   /*Bitmap() imageToStoreBitmap = paciente.getFotos();
    ByteArrayOutputStream objectByteArrayOutputStream = new ByteArrayOutputStream();
    imageToStoreBitmap.compress(Bitmap.CompressFormat.PNG, 100, objectByteArrayOutputStream);
    imageInBytes = objectByteArrayOutputStream.toByteArray();*/

    ContentValues values = new ContentValues();
    values.put("nome", paciente.getNome());
    values.put("cpf", paciente.getCpf());
    values.put("data", paciente.getData());
    values.put("foto",imageInBytes);
    return banco.insert("paciente", null, values);

}
}

Creating the table

@Override
public void onCreate(SQLiteDatabase db) {

    db.execSQL("create table paciente(id integer primary key autoincrement, nome varchar(50), cpf varchar(11), data varchar(15), fotos BLOB )");
    //db.execSQL("create table foto(id integer primary key autoincrement, enviado int(2), fotoBitmap BLOB, pacienteId integer, FOREIGN KEY(pacienteId) REFERENCES paciente(id))");
    Log.d("TAG", "Banco criado com sucesso");

}
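For the one-to-many part, the usual shape is: insert the paciente row first, take the generated id, then insert one foto row per image carrying that id as pacienteId (which also means the foto table needs a pacienteId column before its FOREIGN KEY clause). The sketch below mirrors that flow with in-memory structures; the names follow the question's code, and the real version would use banco.insert on both tables:

```java
import java.util.*;

public class OneToManySketch {
    // In-memory stand-ins for the two SQLite tables.
    static long nextPacienteId = 1;
    static final Map<Long, String> paciente = new HashMap<>();   // id -> nome
    static final List<long[]> foto = new ArrayList<>();          // {pacienteId, photo index}

    static long inserirPaciente(String nome, List<byte[]> fotos) {
        long pacienteId = nextPacienteId++;         // real code: banco.insert("paciente", ...)
        paciente.put(pacienteId, nome);
        for (int i = 0; i < fotos.size(); i++) {
            // real code: one banco.insert("foto", ...) per image, with pacienteId set
            foto.add(new long[]{pacienteId, i});
        }
        return pacienteId;
    }
}
```

On Android, SQLiteDatabase.insert already returns the generated row id of the parent insert, so no extra query is needed to obtain pacienteId.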

In this part I add all the images, whether chosen from the phone's gallery or taken with the camera, to a list of bitmaps: final List<Bitmap> bitmaps = new ArrayList<>();

@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    final ImageView imageView = findViewById(R.id.imageDisplay);
    if(requestCode == 0 && resultCode == RESULT_OK && data != null) {
        Bitmap bt = (Bitmap) data.getExtras().get("data");
        bitmaps.add(bt);

    }else {
        ClipData clipData = data.getClipData();
        if (clipData != null) {
            for (int i = 0; i < clipData.getItemCount(); i++) {
                Uri uri = clipData.getItemAt(i).getUri();

                try {
                    InputStream inputStream = getContentResolver().openInputStream(uri);
                    Bitmap bitmap = BitmapFactory.decodeStream(inputStream);
                    bitmaps.add(bitmap);

                } catch (FileNotFoundException e) {
                    e.printStackTrace();
                }

            }
        }else {
            Uri uri = data.getData();
            try {
                InputStream inputStream = getContentResolver().openInputStream(uri);
                Bitmap bitmap = BitmapFactory.decodeStream(inputStream);
                bitmaps.add(bitmap);
            } catch (FileNotFoundException e) {
                e.printStackTrace();
            }
        }

    }
  }

The patient model:

public class Paciente implements Serializable {

@NonNull
@Expose
private String nome;
@NonNull
@Expose
private String cpf;
@NonNull
@Expose
public List<Foto> fotos;
@Expose
@NonNull
private String data;

...GETTERS AND SETTERS.

Associated with Paciente is the Foto class, which contains the list of captured images.

public class Foto implements Serializable {
@Expose
private List<Bitmap> fotoBitmap;
@Expose
private boolean enviado;
...GETTERS AND SETTERS.