3.1. The input is an array $A$ of $n$ distinct, positive integers. Each integer is the processing time of a job. For any ordering of the jobs, the waiting time of job $j$ is the sum of the processing times of the jobs before $j$ in the ordering. Our goal is to return an ordering of the jobs such that the sum of waiting times is minimized. Describe an $O(n\log n)$-time algorithm for this problem, explain why it's correct, and analyze its running time.
Example: If $A = [5, 1, 2]$, then one ordering is $[5, 1, 2]$, and the sum of waiting times is $0+5+6 = 11$. But the algorithm should return $[1, 2, 5]$ because the sum of waiting times in this order is $0+1+3 = 4$.
The algorithm returns $A$ sorted in non-decreasing order. Intuitively, longer jobs should appear later, because that way they contribute to the waiting times of fewer jobs. More formally, suppose that an optimal ordering is not sorted in non-decreasing order. Then it must process some job $i$ immediately before some job $j$ with $A[i] > A[j]$ (an adjacent inversion). Swapping $i$ and $j$ (and leaving all other jobs untouched) causes $i$ to wait $A[j]$ more time and $j$ to wait $A[i]$ less time; since the swap is between adjacent jobs, the set of jobs preceding every other job is unchanged, so no other waiting time changes. The net effect is therefore a decrease of $A[i] - A[j] > 0$ in the total waiting time, contradicting optimality. Hence every optimal ordering is sorted in non-decreasing order. Sorting the array with, e.g., merge sort takes $O(n\log n)$ time, which dominates the running time.
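The algorithm and the waiting-time objective can be sketched in a few lines of Python. The function names `schedule` and `total_waiting_time` are chosen here for illustration; they are not part of the problem statement.

```python
def schedule(A):
    """Return the processing times ordered to minimize total waiting time.

    By the exchange argument above, sorting in non-decreasing order is optimal.
    Python's sort is a comparison sort running in O(n log n) time.
    """
    return sorted(A)


def total_waiting_time(order):
    """Sum of waiting times: each job waits for all jobs scheduled before it."""
    total = 0
    elapsed = 0  # total processing time of jobs already scheduled
    for t in order:
        total += elapsed  # the current job waits `elapsed` time
        elapsed += t
    return total
```

On the example above, `total_waiting_time([5, 1, 2])` returns $0+5+6 = 11$, while `total_waiting_time(schedule([5, 1, 2]))` returns $0+1+3 = 4$, matching the claimed optimum.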