

New update using multer-sharp-s3: please go all the way down to the last paragraphs to see the code. It is up to you whether to use it; it gives you more control over the file being sent.

Kevin Uriel Azuara Fonseca

What is this going to be About?

AWS S3! Yes, you read that right! Storing uploaded files on your own application's server might seem like a good approach, but it can make your app slower in the long run, since serving those files in the front-end requires a lot of bandwidth. Thank God, Jeff Bezos thought of a better solution: Amazon Web Services.

Amazon Web Services is a pay-as-you-go service, but it gives you the opportunity to play with it for free for one year. I would recommend you all to check it out and use it as much as possible, since I believe it will play an even bigger role in the future!

What’s Mongoose and its Models?

Mongoose is an Object Data Modeling (ODM) tool which works alongside MongoDB. It provides a straightforward, schema-based solution to model your application data. It includes built-in type casting, validation, query building, business logic hooks and more, out of the box!

Models serve as the link between your application and your MongoDB database. Some of the methods that come with them are findById, findOne, findOneAndUpdate and many more.

What does a Mongoose Model look like?

We have now seen the definition of Mongoose, but a definition alone is not going to help us when it comes to coding. Instead, have a look at two of the fields I’m using in my Model; these two fields are used when uploading a new profile avatar picture and when posting a new article with an array of images.

avatar: {
  type: String,
  trim: true,
  required: false
},
images: {
  type: [String],
  trim: true,
  required: [true, 'Please add images']
}

The difference is easy to spot: the avatar field only accepts one string, while the images field can accept as many strings as you want.
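To make the distinction concrete, here is a minimal sketch in plain JavaScript (the URLs are made up) of what a document holding both fields might look like:

```javascript
// Hypothetical profile document: 'avatar' holds a single URL string,
// while 'images' holds an array of URL strings.
const profile = {
  avatar: 'https://my-bucket.s3.amazonaws.com/mysite-1-1600000000000.jpeg',
  images: [
    'https://my-bucket.s3.amazonaws.com/mysite-1-1600000000001.jpeg',
    'https://my-bucket.s3.amazonaws.com/mysite-1-1600000000002.jpeg'
  ]
};

console.log(typeof profile.avatar);         // 'string'
console.log(Array.isArray(profile.images)); // true
```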

NPM install!

npm is a package manager with access to hundreds of thousands of ready-made packages that can easily improve your coding flow.

The usual way to install a module in your app follows the syntax below:

npm i moduleName

This works fine, but it can get tedious when you’re dealing with more than 5 modules. In that case, the best way to add them is to type the names of all the modules on one single line instead of one at a time. Take a look at this:

npm i firstModule secondModule thirdModule andSoOn


In order to work with AWS S3, we have to install 3 modules. The first one is aws-sdk, which provides the code to connect to your AWS S3 bucket; then multer, which handles file data; lastly we have multer-s3, which is similar to multer but returns the data coming from Amazon after a successful upload.

const aws = require('aws-sdk');
const multer = require('multer');
const multerS3 = require('multer-s3');

The main difference between multer and multer-s3 is found in their returned data.

multer returns data found in the memory/disk/local storage while multer-s3 returns Amazon S3 data.

– Kevin Uriel Azuara Fonseca
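To illustrate the quote above, here is a rough sketch (all values are made up) of the `req.file` object each storage engine hands back: multer's disk storage gives you a local `path`, while multer-s3 gives you the S3 `key` and the public `location` URL.

```javascript
// multer (disk storage): the file lives on your own server
const diskFile = {
  fieldname: 'avatar',
  originalname: 'me.png',
  mimetype: 'image/png',
  destination: 'uploads/',
  filename: 'a1b2c3d4.png',
  path: 'uploads/a1b2c3d4.png'
};

// multer-s3: the file lives in your bucket, so you get Amazon's data back
const s3File = {
  fieldname: 'avatar',
  originalname: 'me.png',
  mimetype: 'image/png',
  bucket: 'my-bucket',
  key: 'mysite-1-1600000000000.png',
  location: 'https://my-bucket.s3.amazonaws.com/mysite-1-1600000000000.png'
};

console.log('path' in diskFile, 'location' in s3File); // true true
```

It is that `location` property that we will save into the Mongoose model later on.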

Once installed, require them (as shown above) within the file in which you’re doing your API requests; that’s the file you’d normally call a controller. In that same file, copy and paste the code below in order to make use of it. Here is the code:

// Filter objects
const multerFilter = (req, file, cb) => {
  if (file.mimetype.startsWith('image')) {
    cb(null, true);
  } else {
    cb(new ErrorResponse('Not an image! Please upload only images', 400), false);
  }
};

// Upload files to AWS S3
const s3 = new aws.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});

const upload = multer({ // Opens a multer object
  fileFilter: multerFilter, // uses multerFilter as multer's fileFilter
  storage: multerS3({ // Opens a multerS3 object that accepts different fields
    s3: s3, // the s3 instance holding the keys
    bucket: process.env.AWS_BUCKET_NAME, // This is the name of YOUR bucket
    acl: 'public-read', // Access rules applied to each single file uploaded
    key: function(req, file, cb) { // Builds the file name; receives the request, the file and a callback
      const strOne = process.env.WEBSITE_NAME + '-';
      const userId = req.user.id + '-';
      const userEmail = req.user.email + '-';
      const todaysDate = Date.now().toString() + '.';
      const extension = file.mimetype.split('/')[1];
      const finalStr = strOne.concat(userId, userEmail, todaysDate, extension);
      cb(null, finalStr); // If there are no errors, pass null and the final name
    }
  })
});

exports.uploadUserAvatar = upload.single('avatar'); // A handler that only accepts one file. REPLACE 'avatar' with the name of your model's field
exports.uploadMultipleImages = upload.array('images'); // Same here, but this one accepts as many files as you want; a second parameter can limit the number of files sent. Look it up on the Internet!
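If the key callback above looks dense, the string it builds can be reproduced in isolation. This is a hypothetical standalone version of the same concatenation, using a fake user and a fixed timestamp:

```javascript
// Standalone sketch of the key-building logic from the multer-s3 config above.
function buildKey(websiteName, user, mimetype, now) {
  const extension = mimetype.split('/')[1]; // 'image/jpeg' -> 'jpeg'
  return `${websiteName}-${user.id}-${user.email}-${now}.${extension}`;
}

const key = buildKey('mysite', { id: '42', email: 'user@mail.com' }, 'image/jpeg', 1600000000000);
console.log(key); // 'mysite-42-user@mail.com-1600000000000.jpeg'
```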

As of now, this code works, but it’s not great: you’ll have to repeat yourself if you want to use it in more than one controller. I’ll be working on a middleware/helper/class to make it more dynamic.

Kevin Uriel Azuara Fonseca

The code might be confusing at first, but it’s not that hard to understand once you know how to implement it. Read the comments I’ve written in it as well!

The explanation

The first constant, multerFilter, serves as a file-type limiter: in our case you can only send image files to AWS S3, and that’s that. const s3 holds the keys from AWS S3 that are given to you after creating an account; without these two keys, I’m afraid your code will NOT work. The last one, const upload, is the one behind the magic and makes use of the const s3.
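Since the filter only inspects the file's mimetype, you can check its behavior on its own. A quick sketch, with a plain Error standing in for ErrorResponse:

```javascript
// Same filter logic as above, with a plain Error standing in for ErrorResponse.
const multerFilter = (req, file, cb) => {
  if (file.mimetype.startsWith('image')) {
    cb(null, true);
  } else {
    cb(new Error('Not an image! Please upload only images'), false);
  }
};

multerFilter(null, { mimetype: 'image/png' }, (err, accepted) => {
  console.log(accepted); // true -> the file goes through
});
multerFilter(null, { mimetype: 'application/pdf' }, (err, accepted) => {
  console.log(err.message); // prints the rejection message
});
```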

Here is the reason you need ExpressJS

Any time you create a route in Express, you do something like this:

// Install and require express
const express = require('express');
// Instantiate a const to access the Express Router method
const router = express.Router();
// Example of declaring routes in NodeJS and ExpressJS
// Let's call the one route we're using for this project
router.route('/updatebasics').put(uploadUserAvatar, updateBasics);

updateBasics usually receives the data you’re sending, such as the username, email, work status, address and so on; uploadUserAvatar makes sure that every time a request is sent to updateBasics, the request goes through it first before reaching the target endpoint.
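The ordering matters: Express runs the handlers you pass to a route from left to right, each one calling next() to hand off to the following one. A tiny simplified sketch of that chaining (no Express needed, and with the res argument omitted) looks like this:

```javascript
// Minimal sketch of how Express walks a middleware chain in order.
function runChain(req, handlers) {
  let i = 0;
  const next = () => {
    const handler = handlers[i++];
    if (handler) handler(req, next);
  };
  next();
}

const order = [];
const uploadUserAvatar = (req, next) => { order.push('upload'); next(); };
const updateBasics = (req, next) => { order.push('update'); };

runChain({}, [uploadUserAvatar, updateBasics]);
console.log(order); // ['upload', 'update'] -> the upload middleware runs first
```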

As seen before, the route is calling the uploadUserAvatar we wrote earlier, is it not? That same uploadUserAvatar is equal to upload.single('avatar').

Sending the files to an API request

if (req.file) profileFields.avatar = req.file.location; // For single files

// For multiple files, loop through them and retrieve the location of each file.
if (req.files) {
  req.body.images = req.files.map(image => image.location);
}

If you’re using Postman, you will have to send the data as ‘form-data’, not as JSON as is usual when working with APIs.

Kevin Uriel Azuara Fonseca

Improving our previous code with multer-sharp-s3

Multer Sharp S3 is an open-source library that does exactly the same as multer-s3 but gives you more control over the file being sent. You can manipulate the file by resizing it, saving multiple versions with different suffixes, and many more features that I’m sure will be useful to you.

const multer = require('multer');
const aws = require('aws-sdk');
const multerSharp = require('multer-sharp-s3');

const s3 = new aws.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
});

const multerFilter = (req, file, cb) => {
  if (file.mimetype.startsWith('image')) {
    cb(null, true);
  } else {
    cb(new AppError('Not an image! Please upload only images.', 400), false);
  }
};

const upload = multer({
  fileFilter: multerFilter,
  storage: multerSharp({
    s3: s3,
    Bucket: process.env.AWS_BUCKET_NAME,
    ACL: 'public-read',
    Key: (req, file, cb) => {
      const strOne = 'project-';
      const userId = `${req.user.id}-`;
      const todaysDate = `${Date.now().toString()}.`;
      const extension = file.mimetype.split('/')[1];
      const finalStr = strOne.concat(userId, todaysDate, extension);
      cb(null, finalStr);
    },
    resize: {
      width: 2000,
      height: 1333
    },
    toFormat: {
      type: 'jpeg',
      options: {
        progressive: true,
        quality: 50
      }
    }
  })
});

Using multer-sharp-s3 is OPTIONAL. It can be very useful for keeping bandwidth down, but if you’re creating a small project that won’t require much fetching of images, I would recommend skipping it. It’s not OK to bloat your server’s request times by insisting on a tool that might not even be a good fit for your project’s size.

Kevin Uriel Azuara Fonseca

BIG UPDATE! preSignedURLs!

Now, instead of processing the files on the backend side of an application, don’t you think it would be better to handle them on the frontend?

The reason for this change was repetition: I had to repeat myself on several occasions to get the same functionality, and the code was so damn big it pissed me off. It required 3 libraries, some setup in each of the routes the function was used with and, gosh, I just hated it.

What are preSignedURLs?

A preSignedURL is a URL that you can provide to your users to grant temporary access to a specific S3 object. Using the URL, a user can either READ, WRITE or even DELETE an object. The URL contains specific parameters which are set by your application.

Aidan Hallet

With that being said, the logic is still somewhat similar to the code blocks shown above (one of them, not sure which one xD). The difference is that it can now scale! The code also lives in its own file/controller/route/you-name-it. Take a look at it:

const s3 = new aws.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  region: 'us-east-2',
  signatureVersion: 'v4'
});

// @desc    Get preSignedURL from Amazon AWS S3
// @route   GET /api/v1/uploads/uploadObject
// @access  Private
// @status  DONE
exports.uploadObject = asyncHandler(async (req, res, next) => {
  const { name, type } = req.query;
  const fileExtension = type.substring(type.indexOf('/') + 1);
  const key = `${process.env.WEBSITE_NAME}-${name
    .replace(/[^a-z0-9]/g, '-')}-${Date.now().toString()}.${fileExtension}`;

  const params = {
    Bucket: process.env.AWS_BUCKET_NAME,
    Key: key,
    ContentType: type,
    Expires: 5000,
    ACL: 'public-read'
  };

  s3.getSignedUrl('putObject', params, (err, signedURL) => {
    if (err) {
      return next(
        new ErrorResponse('There was an error with the files being uploaded', 500)
      );
    }
    res.status(200).json({
      success: true,
      postURL: signedURL,
      getURL: signedURL.split('?')[0]
    });
  });
});

The key points in the code are the signatureVersion from the s3 instance, Expires from the params object, and the getSignedUrl function along with the putObject parameter.

After that, you can modify the returned JSON as much as you want, as long as it makes sense to you when it comes to calling this function from the front end of your app. In my case, I will use the getURL response.
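The split on `?` in the controller is what turns the signed PUT URL into a plain GET URL. With a made-up signed URL (the query parameter values are fake), it works like this:

```javascript
// A presigned URL is just the object URL plus signing query parameters.
const postURL =
  'https://my-bucket.s3.us-east-2.amazonaws.com/mysite-abc-1600000000000.jpeg' +
  '?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Expires=5000&X-Amz-Signature=deadbeef';

// Dropping everything after '?' leaves the permanent object URL.
const getURL = postURL.split('?')[0];
console.log(getURL); // 'https://my-bucket.s3.us-east-2.amazonaws.com/mysite-abc-1600000000000.jpeg'
```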

Not to forget, this is what my router looks like:

const express = require('express');

const { uploadObject } = require('../../controllers/uploads');

const router = express.Router();

router.route('/uploadObject').get(uploadObject);

module.exports = router;

Calling the uploadObject function from the Front End…

Here is an easy input built with react-bootstrap:

<Form.File label={posterName} accept={`image/*`} custom onChange={onDropPoster} required />
<hr />
<Progress percentage={secondUploadPercentage} />
<hr />
<Button variant={`secondary`} size={`sm`} className={`my-1`} block type={`submit`} disabled={isSecondDisabled} onClick={handleUploadThumbnail}>
    Upload Thumbnail
</Button>

The input first receives a FileList and then passes it to the onDrop function, which looks exactly like this:

const [poster, setPoster] = useState(null);
const [posterName, setPosterName] = useState('Choose file');
const [posterURL, setPosterURL] = useState('');

const onDropPoster = e => {
  setPoster(e.target.files[0]); // keep the file itself
  setPosterName(e.target.files[0].name); // and show its name in the input label
};

This function sets the file as the poster ‘file’ to be used and grabs the file’s name. Then we move on to the handleUpload function:

const handleUploadThumbnail = async e => {
  const posterUploaded = await uploadObject(poster, setSecondUploadPercentage);
  setPosterURL(posterUploaded.getURL); // keep the permanent URL of the uploaded file
};

That’s about it: we proceed to the uploadObject function, which will upload the file and retrieve its URL. Pay close attention to the code below; it includes authentication due to my security concerns. You can opt out of it by just deleting those lines, but don’t forget to set up your bucket policy to NOT expect an authentication header.

export const uploadObject = (file, setUploadPercentage) => async dispatch => {
  try {
    // const token = localStorage.getItem('xAuthToken');
    // api.defaults.headers.common['Authorization'] = `Bearer ${token}`;
    const uploadConfig = await api.get(
      `/uploads/uploadObject?name=${file.name}&type=${file.type}&size=${file.size}`
    );
    // delete api.defaults.headers.common['Authorization'];
    await api.put(uploadConfig.data.postURL, file, {
      headers: {
        'Content-Type': file.type
      },
      onUploadProgress: progressEvent => {
        setUploadPercentage(
          parseInt(Math.round((progressEvent.loaded * 100) / progressEvent.total))
        );
        setTimeout(() => setUploadPercentage(0), 10000);
      }
    });
    // api.defaults.headers.common['Authorization'] = `Bearer ${token}`;

    return uploadConfig.data;
  } catch (err) {
    const errors = err?.response?.data.errors;
    if (errors) {
      errors.forEach(error => dispatch(setAlert(error.msg, 'danger')));
    }
    return errors;
  }
};
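The onUploadProgress callback boils down to one percentage computation. Isolated, with made-up byte counts, it looks like this:

```javascript
// Same percentage math used inside onUploadProgress above.
function uploadPercentage(loaded, total) {
  return parseInt(Math.round((loaded * 100) / total));
}

console.log(uploadPercentage(512, 2048));  // 25
console.log(uploadPercentage(2048, 2048)); // 100
```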

That’s all! If you found this article useful, please share it with your friends. If you think it can be improved, please tell me too! Bye-Bye 🙂
