Code Bug Fix: Getting 400 Bad Request When POSTing to Get Transaction Token

Original Source Link

I’m trying to integrate our website with Converge API with Hosted Payments Page. Here is the link to their documentation https://developer.elavon.com/#/api/eb6e9106-0172-4305-bc5a-b3ebe832f823.rcosoomi/versions/5180a9f2-741b-439c-bced-5c84a822f39b.rcosoomi/documents?converge-integration-guide/book/integration_methods/../../book/integration_methods/hosted_payments.html

I'm having trouble getting past the first step, which is requesting a transaction token from their API endpoint. I'm sending a POST request from my server using axios with the correct parameters and URL, but when I POST I get 400 Bad Request. When I make the same request in Postman I get a 200 response with the transaction token. I talked to their developers and they said that everything I was doing was correct and that nothing seemed odd in my code, so even they were stumped as to why I couldn't make a POST request to their endpoint. Obviously there is something in my code that their API doesn't like, or else I wouldn't be here looking for answers.

Here is how I’m making the POST request:

app.get('/converge_token_req', (request, response) => {

    let params = {
        ssl_merchant_id: '*****',
        ssl_user_id: '*****',
        ssl_pin: '*****',
        ssl_transaction_type: 'ccsale',
        ssl_amount: '1.00'
    }

    axios.post('https://api.demo.convergepay.com/hosted-payments/transaction_token', params, {
        headers: { 'Content_Type' : 'application/x-www-form-urlencoded' }
    }).then((res) => {
        response.send(res.data)
    }).catch((error) => {
        console.log('there was an error getting transaction token')
        response.send(error.message)
    })

})

The request headers from the failing request were shown in a screenshot (omitted here).

I'm honestly out of ideas to try. The developers say that everything looks fine, yet I'm unable to make a successful request to their API. If anyone has any thoughts on this, that would be great. Thanks!

The code below worked for me:

app.get('/converge_token_req', (request, response) => {

    let params = {
        ssl_merchant_id: '*****',
        ssl_user_id: '*****',
        ssl_pin: '*****',
        ssl_transaction_type: 'ccsale',
        ssl_amount: '1.00'
    }

    axios({
        method: 'post',
        url: 'https://api.demo.convergepay.com/hosted-payments/transaction_token',
        params: params
    }).then((res) => {
        response.send(res.data)
    }).catch((error) => {
        console.log('there was an error getting transaction token: ', error)
    })

})
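
For what it's worth, the main differences between the two snippets are where axios puts the fields and the header name: axios.post(url, data) sends the object as the request body (JSON by default), while the params option serializes the fields into the query string, and the failing request also spells the header Content_Type with an underscore instead of Content-Type. If you wanted to send the fields as a URL-encoded body instead, one way (a sketch only, with placeholder credentials and the same URL and field names as above) is URLSearchParams, which produces an application/x-www-form-urlencoded body:

const axios = require('axios')

// Sketch only: same endpoint and field names as the post, placeholder credentials.
const body = new URLSearchParams({
  ssl_merchant_id: '*****',
  ssl_user_id: '*****',
  ssl_pin: '*****',
  ssl_transaction_type: 'ccsale',
  ssl_amount: '1.00'
})

axios.post('https://api.demo.convergepay.com/hosted-payments/transaction_token', body.toString(), {
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' } // hyphen, not underscore
})
  .then((res) => console.log(res.data))
  .catch((err) => console.error('token request failed:', err.message))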


Code Bug Fix: Validation error after transaction commit

Original Source Link

I am working on a node project that uses Sequelize 5.21.5 with MySql 5.5 as database engine.

I have a method that inserts values into a linking table that is used in a many-to-many relationship. There is a unique constraint on the linking table that dictates that the combination of the two id fields must be unique.


I want to be able to insert multiple entries into the user_levels table at the same time. I tried doing this with a Sequelize transaction. I first add a userLevel with a userId of 1 and a levelId of 9 (a combination that does not yet exist), and then I add another one with a userId of 1 and a levelId of 8 (which does exist).

I expected the transaction to fail and be rolled back; however, the transaction was committed, only the userLevel with the levelId of 9 was added, and a validation error was thrown for the userLevel with the levelId of 8.

How can I make the transaction fail as a whole and have it rolled back instead?

  updateLevelsForUser = async (req: Request, res: Response): Promise<any> => {
    const userId: number = parseInt(req.params.userId, 10);
    const levelIds: number[] = [9, 8];

    Database.getInstance().sequelize.transaction(async (t) => {
      for (let i = 0; i < levelIds.length; i += 1) {
        UserLevel.create<UserLevel>(
          {
            userId,
            levelId: levelIds[i],
          },
          { transaction: t },
        );
      }
    });
  };

Output:

Executing (0efdccac-6d93-4434-8ac4-1ff291d65178): START TRANSACTION;
Executing (0efdccac-6d93-4434-8ac4-1ff291d65178): INSERT INTO `user_levels` (`id`,`userId`,`levelId`) VALUES (DEFAULT,?,?);
Executing (0efdccac-6d93-4434-8ac4-1ff291d65178): INSERT INTO `user_levels` (`id`,`userId`,`levelId`) VALUES (DEFAULT,?,?);
Executing (0efdccac-6d93-4434-8ac4-1ff291d65178): COMMIT;
Unhandled rejection SequelizeUniqueConstraintError: Validation error

You forgot to add await before UserLevel.create

Database.getInstance().sequelize.transaction(async (t) => {
    for (let i = 0; i < levelIds.length; i += 1) {
        await UserLevel.create<UserLevel>(
            {
                userId,
                levelId: levelIds[i],
            },
            { transaction: t },
        );
    }
});
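
Without the await, each create call returns a pending promise that the transaction callback never waits for, so Sequelize commits as soon as the callback returns and the later unique-constraint violation surfaces as an unhandled rejection instead of rolling anything back. A minimal sketch of the full handler with the awaits in place (plain JS, assuming the same Database singleton and UserLevel model from the question):

// Hedged sketch of the corrected handler. With a managed transaction, if any
// create rejects, Sequelize rolls everything back before the error propagates.
const updateLevelsForUser = async (req, res) => {
  const userId = parseInt(req.params.userId, 10)
  const levelIds = [9, 8]

  try {
    await Database.getInstance().sequelize.transaction(async (t) => {
      for (const levelId of levelIds) {
        await UserLevel.create({ userId, levelId }, { transaction: t })
      }
    })
    res.status(200).json({ message: 'levels updated' })
  } catch (err) {
    // e.g. SequelizeUniqueConstraintError when the (userId, levelId) pair already exists
    res.status(400).json({ error: err.message })
  }
}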


Code Bug Fix: Using Caching to fetch data form sql server

Original Source Link

How do I import SQL table data into a Redis cache? I wrote the code below; is it correct?
The request never completes, and I get this error:
GET /product/Data – – ms – –

userController.js:

const redisclass = require('../Models/redisModel');
const redis = require("redis");
const redisUrl = 'redis://127.0.0.1:6379';
const client = redis.createClient(redisUrl);

const Data = async (req, res, next) => {
    cache, getData
}

async function getData(req, res, next) {
    try {
        console.log('Fetching Data From SQL server...')
        const dbResult = await redisclass.findAll(req.Product)
        const allProducts = dbResult.recordset;
        const data = await Response.json(allProducts);
        return res.send({ message: '', products: data })
    } catch (error) {
        console.error(error);
        res.status(500).json({ error: error })
    }
}

// Cache middleware
async function cache(req, res, next) {
    console.log('Fetching Data From Cache...')
    const Pull_Products = await SingletonSqlHandler.instance.sendQuery('SELECT * from dbo.getAllProducts()')
    client.get(Pull_Products, (error, cachedData) => {
        if (error) throw error
        if (cachedData != null) {
            res.send(setResponse(Pull_Products, cachedData));
        } else {
            next();
        }
    })
}

module.exports = {
    Data: Data
}

redisModel.js:

const SingletonSqlHandler = require('../SQLServer/singleton');

class redisclass {
    static async findAll() {
        const Pull_Products = await SingletonSqlHandler.instance.sendQuery('SELECT * from dbo.getAllProducts()')
        return Pull_Products;
    }
}

module.exports = redisclass

redisRoute.js:

  const router = require('express').Router();
  const Redis=require('../redis-cache/server')
  router.get('/Data', Redis.Data);
  module.exports = router;
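
The post has no answer, but for comparison, a common cache-aside layout looks roughly like the sketch below (this is an assumption about intent, not the original author's code): the middleware checks Redis under a fixed string key and serves the cached JSON if present; otherwise it falls through to the SQL handler, which stores the result in Redis before responding. Note that in the controller above, Data never actually calls cache or getData, and client.get appears to be given a whole SQL result set rather than a string key, which would explain why the route never responds.

// Hedged sketch (not from the original post): typical cache-aside layout.
// The key name 'products', route, and TTL are illustrative assumptions.
const express = require('express')
const redis = require('redis')
const redisclass = require('../Models/redisModel')

const client = redis.createClient('redis://127.0.0.1:6379')
const CACHE_KEY = 'products'

// Middleware: serve from Redis when possible, otherwise fall through.
function cache(req, res, next) {
  client.get(CACHE_KEY, (err, cached) => {
    if (err) return next(err)
    if (cached) return res.send({ message: 'from cache', products: JSON.parse(cached) })
    next()
  })
}

// Handler: query SQL Server, store the result in Redis, then respond.
async function getData(req, res) {
  try {
    const dbResult = await redisclass.findAll()
    const allProducts = dbResult.recordset || dbResult
    client.setex(CACHE_KEY, 60, JSON.stringify(allProducts)) // cache for 60 seconds
    res.send({ message: 'from sql', products: allProducts })
  } catch (error) {
    res.status(500).json({ error: error.message })
  }
}

// Register both on the route so the middleware runs before the handler.
const router = express.Router()
router.get('/Data', cache, getData)
module.exports = router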


Code Bug Fix: Delete all channels in a specific category

Original Source Link

How do I delete all the channels in a specific category in Discord.JS?
I tried this:

oldMember.guild.channels.cache.get(client.tr["Settings"].MainChannelID).parent.channels.cache.forEach(c => {
        if(c.members.size != 0){
            c.delete();
        }
    })

You can get all the channels that belong to a category using CategoryChannel.children

const category = await oldMember.guild.channels.cache.get(CATEGORY_ID); // You can use `find` instead of `get` to fetch the category using a name: `find(cat => cat.name === 'test')`
category.children.forEach(channel => channel.delete())
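
Each channel.delete() returns a promise, so if you need to know when all of the channels are actually gone (or to catch failures), you could collect them, e.g. inside an async function:

// Hedged variant: wait for every deletion to finish and surface any failure.
// Collection#map returns an array of promises.
const category = oldMember.guild.channels.cache.get(CATEGORY_ID)
await Promise.all(category.children.map(channel => channel.delete()))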


Code Bug Fix: my function runs faster than the interval set

Original Source Link

I have this code, which is supposed to send a message and add to a variable every 10 minutes

function btcb() {
          const embed = new Discord.MessageEmbed()
          .setColor('#FF9900')
          .setTitle("Bitcoin block #"+bx.blocks.btc+" was mined")
          .setAuthor('Block mined', 'https://cdn.discordapp.com/emojis/710590499991322714.png?v=1')
          client.channels.cache.get(`710907679186354358`).send(embed)
          bx.blocks.btc = bx.blocks.btc+1
        }
    setInterval(btcb,600000)

But it actually does it every 2-3 minutes instead. What am I doing wrong?

You're better off setting the interval to 1 second and counting 600 seconds before resetting:

let sec = 0;

function btcb() {
  if (sec++ < 600) return;
  sec = 0;

  const embed = new Discord.MessageEmbed()
    .setColor('#FF9900')
    .setTitle("Bitcoin block #" + bx.blocks.btc + " was mined")
    .setAuthor('Block mined', 'https://cdn.discordapp.com/emojis/710590499991322714.png?v=1')
  client.channels.cache.get(`710907679186354358`).send(embed)
  bx.blocks.btc = bx.blocks.btc + 1
}

setInterval(btcb, 1000)
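
If you want to confirm what cadence the callback is really running at, a quick diagnostic (not part of the original answer) is to log the time between calls; if the gaps are much shorter than expected, it may also be worth checking that the setInterval line isn't being reached more than once, for example from inside an event handler.

// Diagnostic sketch only: log how much time actually passes between calls.
let lastRun = Date.now()
function btcbTimed() {
  const now = Date.now()
  console.log('btcb fired ' + (now - lastRun) / 1000 + 's after the previous call')
  lastRun = now
  // ...the original btcb body would go here...
}
setInterval(btcbTimed, 600000)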


Code Bug Fix: Push Array Items to Array type Column in mongoDb

Original Source Link

This is a controller in which I'm trying to catch multiple candidate ids (ObjectId) and store them in the Candidates array in the database. But the data is not getting pushed into the Candidates column, which is of Array type.

routes.post('/Job/:id', checkAuthenticated, function (req, res) {
    var candidates = req.body.candidate;
    console.log(candidates);
    Job.update(
        { _id: req.params.id },
        { $push: { Appliedby: req.user.username } },
        { $push: { Candidates: { $each: candidates } } }
    );
});

Console output:

[ '5eb257119f2b2f0b4883558b', '5eb2ae1cff3ae7106019ad7e' ] //candidates

You have to do all the update operations ($set, $push, $pull, ...) in one object, and this object should be the second argument passed to the update method, after the filter object.

{ $push: { Appliedby: req.user.username } }, { $push: { Candidates: { $each: candidates } } }

This will update the Appliedby array only, as the third argument of update is reserved for options (like upsert, new, ...).

You have to do something like this:

{ $push: { Appliedby: req.user.username, Candidates: { $each: candidates } } }

Then the whole query should be something like this:

routes.post('/Job/:id', checkAuthenticated, function (req, res) {
    var candidates = req.body.candidate;
    console.log(candidates);

    Job.update(
        { _id: req.params.id }, // filter part
        { $push: { Appliedby: req.user.username, Candidates: { $each: candidates } } } // update part in one object
    )
});

this could do the trick I guess, hope it helps
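
Two small follow-ups, purely as hedged suggestions: in recent Mongoose versions Model.update() is deprecated in favor of updateOne()/updateMany(), and the route above never sends a response, so the client will hang. A variant along those lines might look like:

// Hedged variant (assumes Mongoose): updateOne instead of the deprecated update,
// await the result, and send a response either way.
routes.post('/Job/:id', checkAuthenticated, async function (req, res) {
  try {
    const candidates = req.body.candidate
    await Job.updateOne(
      { _id: req.params.id },
      { $push: { Appliedby: req.user.username, Candidates: { $each: candidates } } }
    )
    res.json({ message: 'updated' })
  } catch (err) {
    res.status(500).json({ error: err.message })
  }
})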


Code Bug Fix: Push Notifications For An OTP Mobile Application

Original Source Link

I was given the following challenge: “I want you to develop a mobile application based on React Native. It's an authentication app, meaning: I want to open a website and try to log in, and when I do, I will receive a push notification on the device; when I open the notification I will get a code to enter on the website, and when I enter it I should be logged in.”

I haven't started on the code yet; I'm trying to figure out how the process should work.
I was thinking of implementing a TOTP algorithm using Node.JS, but I got stuck on how the app will know that the user logged in on the website so that it can send a push notification.

You’ll need some sort of server that the website will make a request to. When the server gets the request, it will need to look up a device token for the user logging in, and send a push notification to that device. One simple way to do this is with firebase database and cloud functions.

  1. Run a simple API in a Cloud Function, which you will call from the website on login. It will check your database for user info and device tokens, and send an appropriate push if necessary. You can use Node.js for this (see the sketch after this list).

  2. When a user signs up for/logs into their app, use react-native-firebase to get their device token, update this in your database. Add handling for when a notification is opened, and give them a code.

  3. Make sure you use database security rules to ensure the device tokens and codes are secured.
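
As a rough sketch of step 1 (illustrative only; the database path, payload fields, and function name are assumptions, not from the original answer):

// Hedged sketch: an HTTPS Cloud Function the website calls on login. It looks up
// the user's device token in the Realtime Database and sends a push via FCM.
const functions = require('firebase-functions')
const admin = require('firebase-admin')
admin.initializeApp()

exports.sendLoginOtp = functions.https.onRequest(async (req, res) => {
  try {
    const { userId, code } = req.body // the website supplies who is logging in and the OTP
    const snap = await admin.database().ref(`/users/${userId}/deviceToken`).once('value')
    const token = snap.val()
    if (!token) return res.status(404).send('no device registered')

    await admin.messaging().send({
      token,
      notification: { title: 'Login code', body: 'Open the app to view your code' },
      data: { code: String(code) } // the app reads this when the notification is opened
    })
    res.status(200).send('push sent')
  } catch (err) {
    res.status(500).send(err.message)
  }
})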


Code Bug Fix: Add prefix to XML with lib xml2js

Original Source Link

When going from XML to JS, the processors.stripPrefix option allows you to remove the prefix. Is there any option to add a prefix?

const jsonObj = {
  foo: {
    bar: {
      hello: 'world'
    }
  }
};
const builder = new xml2js.Builder();
const xml = builder.buildObject(jsonObj);
console.log(xml);

//I need this result
<prefix:foo>
  <prefix:bar>
    <prefix:hello>world</prefix:hello>
  </prefix:bar>
</prefix:foo>

Any solution please ??

Based on the official documentation, xml2js does not have a feature to add prefixes to keys.

You'd have to add them yourself, so this is a workaround that works for simple objects:

const xml2js = require('xml2js')
const jsonObj = {
  foo: {
    bar: {
      hello: 'world'
    }
  }
}
const builder = new xml2js.Builder()
const prefix = 'abc'
const prefixedObj = JSON.parse(
  JSON.stringify(jsonObj)
    .replace(/"([^"]+)":/g, `"${prefix}:$1":`))
const xml = builder.buildObject(prefixedObj)
console.log(xml)

This will produce

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<abc:foo>
  <abc:bar>
    <abc:hello>world</abc:hello>
  </abc:bar>
</abc:foo>
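
One caveat with the stringify/replace trick: the regex operates on the serialized text rather than the structure, so an unusual string value containing a quote-colon sequence could be rewritten too. A slightly safer variant (still just a sketch) renames the keys recursively before building:

// Hedged alternative: walk the object and prefix keys explicitly,
// rather than rewriting the serialized JSON with a regex.
const xml2js = require('xml2js')

function prefixKeys(obj, prefix) {
  if (Array.isArray(obj)) return obj.map(item => prefixKeys(item, prefix))
  if (obj !== null && typeof obj === 'object') {
    return Object.fromEntries(
      Object.entries(obj).map(([key, value]) => [`${prefix}:${key}`, prefixKeys(value, prefix)])
    )
  }
  return obj // primitives (like 'world') are left untouched
}

const jsonObj = { foo: { bar: { hello: 'world' } } }
const builder = new xml2js.Builder()
console.log(builder.buildObject(prefixKeys(jsonObj, 'abc')))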


Code Bug Fix: Axios post request with credentials and body

Original Source Link

I am trying to make the same request as the curl command below in Axios, but it does not seem to work. Does anyone have any idea?

curl -X POST -vu CLIENT_ID:CLIENT_SECRET http://localhost:9001/oauth/token -d "grant_type=password&username=user&password=password"

in Axios:

var reqData = "grant_type=password&username=username&password=password"

return axios.post("http://localhost:9001/oauth/token", reqData, {
    headers: {
        "Postman-Token": "4b7526c0-a204-4ad5-6194-49e0cd1bfd12",
        "Cache-Control": "no-cache",
        "Authorization": "Basic " + base64.encode(CLIENT_ID + ':' + CLIENT_SECRET),
        "Content-Type": "application/x-www-form-urlencoded"
    }
}).then(response => {
    console.log(response)
    return response
})
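
For comparison, two axios features line up directly with the curl flags, so a hedged equivalent might look like the sketch below: -u CLIENT_ID:CLIENT_SECRET maps to axios's auth option (it builds the Basic header for you), and -d "..." is a URL-encoded body, which URLSearchParams produces.

// Hedged sketch: the curl command translated using axios's auth option and a
// URL-encoded body. CLIENT_ID, CLIENT_SECRET and the credentials are placeholders.
const axios = require('axios')

const body = new URLSearchParams({
  grant_type: 'password',
  username: 'user',
  password: 'password'
})

axios.post('http://localhost:9001/oauth/token', body.toString(), {
  auth: { username: CLIENT_ID, password: CLIENT_SECRET }, // equivalent of -vu CLIENT_ID:CLIENT_SECRET
  headers: { 'Content-Type': 'application/x-www-form-urlencoded' }
})
  .then(response => console.log(response.data))
  .catch(error => console.error(error.response ? error.response.data : error.message))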


Code Bug Fix: How to send res.json() in Express JS from a for loop

Original Source Link

I am trying to send a JSON response when an API call is triggered from the front end, but I'm not able to send res.json() because the data is gathered in a for loop that queries multiple tables. I am using RethinkDB.

I want res.json() to send the data after the queries finish, but I don't understand what mistake I am making. 🙁

Thanks in advance

zeasts

Here is the code (the original post also linked a fiddle):

const express = require("express");
const router = express.Router();
const moment = require('moment');
const r = require('rethinkdb');
const tableNameDB = ['assets', 'alerts',  'destinations']


router.post('/', (req, res, next) => {
        let resData = []
        let searchValue = req.body.searchValue,
            tableName = req.body.tableName;
        newCallForSearch(res, searchValue, resData)
    })

function newCallForSearch (res, searchValue, resData){
    let anArray = ['captain']
    for(var i = 0; i<tableNameDB.length; i++){
        let tabName = tableNameDB[i]
        r.table(tableNameDB[i]).filter(function(doc) {
            return doc.coerceTo('string').match(searchValue);
        }).run(rethink_conn, (err, cur) => {
           // console.log(cur)
            if (err) {
                return 0
            } else {
                cur.toArray((err, result) => {
                    if (err) {
                        return 0
                    } else if (result) {
                        let Results = []
                        Results = Object.values(result).slice(0,10)
                        var newResults = Results.map(function() {
                            resData = Object.assign({'tableName': tabName},{'data' : result});
                            anArray.push(resData)
                        })
                    }
                })    
                
            }
                
        })
    }   
    res.status(200);
    res.json(anArray); 
}

module.exports = router;

RethinkDb is a functional database and so using it in a functional way will yield the least resistance. We can accomplish more by writing less code.

You can use Promise.all to run many subqueries and pass the result to res.send

const search = (table = "", query = "", limit = 10) =>
  r.table(table)
  .filter(doc => doc.coerceTo("string").match(query))
  .toArray()
  .slice(0, limit)
  .do(data => ({ table, data }))

const tables =
  ['assets', 'alerts',  'destinations']

const subqueries =
  tables.map(t => search(t, "foo").run(rethink_conn)) // <-- search each table

Promise.all(subqueries) // <-- run all subqueries
  .then(result => {     // <-- handle success
    res.status(200)
    res.json(result)
  })
  .catch(e => {         // <-- handle failure
    res.status(500)
    res.send(e.message)
  })

Or use .union to produce a single query –

const search = (table = "", query = "", limit = 10) =>
  r.table(table)
  .filter(doc => doc.coerceTo("string").match(query))
  .toArray()
  .slice(0, limit)
  .do(data => ({ table, data }))

const searchAll = (tables = [], query = "", limit = 10) =>
  tables.reduce
    ( (r, t) => r.union(search(t, query, limit)) // <-- union
    , r.expr([]) // <-- if no tables are supplied, return empty result
    )

const tables =
  ['assets', 'alerts',  'destinations']

searchAll(tables, "foo") // <-- single rethink expr
  .run(rethink_conn)     // <-- run returns a promise
  .then(result => {      // <-- handle success
    res.status(200)
    res.json(result)
  })
  .catch(e => {          // <-- handle failure
    res.status(500)
    res.send(e.message)
  })

I should remark on the proposed use of filter in your original post –

.filter(doc => doc.coerceTo("string").match(query))

This is quick, but it is also sloppy. It matches query against any of the doc's values, but also against the doc's keys. And if doc is a complex nested document, it matches the nested values too. User beware.
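
If the searchable fields are known ahead of time, restricting the match avoids that; a sketch (the field names here are made up):

// Hedged sketch: match the query only against specific fields instead of the
// whole document coerced to a string. 'name' and 'description' are made-up fields.
const search = (table = "", query = "", limit = 10) =>
  r.table(table)
    .filter(doc => doc("name").match(query).or(doc("description").match(query)))
    .limit(limit)
    .coerceTo("array")
    .do(data => ({ table, data }))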
